This means ensuring your business appears when a potential customer searches for relevant products or services "near me" without necessarily including a specific location in their search terms. Keeping your GMB profile complete and up to date can significantly influence customer perceptions and enhance visibility in search results.

Optimizing Google My Business Profiles
Setting up and meticulously maintaining a GMB profile is crucial because of its direct impact on local search visibility.
Continuous Monitoring and Adaptation
Maintaining top performance in local SEO requires ongoing attention to changes in search algorithms and shifts in consumer behavior patterns. It's about making ongoing adjustments based on concrete data rather than assumptions about what might lead to better user experiences or enhanced SEO outcomes.
Multi-location businesses need to create individual pages for each location to capture localized traffic effectively without diluting the main site's relevance to broader searches. This ensures that content and metadata speak to each regional audience, strengthening your site's standing in local search results.
For multi-location businesses, creating distinct localized pages for each area ensures that each location maintains its unique presence online.

Integrating Reviews into Broader SEO Efforts
Beyond mere collection and management, integrating customer feedback into a broader SEO strategy amplifies its benefits.
Consider collaborating with local chambers of commerce or professional groups related to your wholesale niche. By sharing content that includes links to your website, you increase the likelihood of that content being shared further or picked up by bloggers and other websites that might link back to you organically.

Understanding the Impact of Mobile Optimization on Local Search

Website Localization for Enhanced Local Search
Local SEO optimization begins fundamentally with the localization of your website.

Effective GMB Integration
A well-optimized Google My Business (GMB) profile is essential for successful local SEO strategies. It ensures that your business appears prominently in local searches and Google Maps.

Proactive Reputation Management
Managing online reviews actively builds trust among prospective clients, who often consult these reviews before making purchase decisions. Monitoring review sites helps identify operational issues early, before they affect wider perceptions of your brand.

Localized content matters, too. For example, if you specialize in construction materials, articles could discuss regional building projects or changes in local construction regulations, providing valuable information that aligns with what locals are searching for.
This means that for wholesalers looking to tap into specific markets, relying solely on traditional organic search strategies may not be sufficient. Encouraging satisfied customers to leave positive reviews, while tactfully responding to negative feedback, sustains the high ratings that boost both visibility in the "Local Pack" results and consumer trust in a service provider. This approach extends to crafting localized content for your Google My Business (GMB) profile and website text, ensuring alignment with consumer search behaviors and trends. Hosting videos on popular platforms like YouTube and linking back to your business's website can drive additional traffic and improve search rankings by creating high-quality backlinks.

Creating Content That Ranks: Tips for Wholesale Distributors

Understanding the Importance of Local SEO for Wholesale Distributors
Wholesale distributors seeking to enhance their online presence must recognize the transformative role of local SEO. Engaging in guest blogging can also be an effective strategy; by providing valuable content on other platforms, you can earn a place for your link in a natural context.

Essential On-Page SEO Techniques for Distribution Websites

Website Localisation
For wholesale distributors, localising your website effectively is crucial.
Additionally, audits include scrutiny of Google Business Profiles and detailed keyword research to gauge competitive standing and identify actionable opportunities for improvement. Initially, local SEO simply involved incorporating geographical locations into search terms.

Optimizing Google My Business
An optimized Google Business Profile (GMB) is crucial for successful local SEO. Through advanced keyword research tools, we identify optimal keywords that align with the geographical nuances of your market. Location details are essential not only in the footer and contact page but also throughout on-page elements such as H1 and H2 headings and metadata. Whether through regular blog posts about local events or specialized hubs like moving guides tailored for specific areas, such content helps establish your site as a credible source for local information, improving both reach and reputation. Using advanced tools, this process identifies optimal keywords to target within website content and Google My Business (GMB) profiles, ensuring alignment with consumer search behaviors and trends.
The Power of Local Keyword Research
Conducting thorough keyword research is imperative to crafting an effective local SEO strategy. Whether through regular blog posts about local events or specialized content hubs like a 'moving house guide' for real estate clients, this approach helps build a credible and engaging online presence. Utilizing analytics tools to track views, watch time, click-through rates from videos to websites, and conversion metrics provides insight into how effectively video content contributes to achieving SEO goals.

Local Blogging Tactics to Drive More Traffic to Your Distribution Site

Local SEO Audit
A comprehensive local SEO audit is the first step towards enhancing your distribution site's visibility. In effect, this means that integrating video marketing into a comprehensive local SEO strategy enables wholesale distributors not just to stand out among competitors but also to attract more qualified traffic through improved ranking positions within localized searches. It also means that understanding mobile optimization's role within local SEO demands attention across multiple facets, from precise localization efforts on your web presence to dynamic content creation aimed at engaging a geographically targeted audience, all while continuously adapting strategies based on comprehensive performance data. Establishing content hubs around these topics helps create a comprehensive resource that enhances user engagement and improves search rankings through its relevance and utility.
Engagement Through Google My Business Posts
Utilizing GMB posts allows you to communicate directly with a localized audience by sharing relevant news, updates, or promotions, which further enhances engagement and visibility within the community you serve. Tools like Google's Search Console offer ways to check how much traffic is generated from rich snippets and whether errors in markup implementation need rectification.

Utilize Google My Business Posts
Regularly update your GMB profile with posts about promotions, company news, or relevant community activities. By focusing on these strategic elements when building backlinks, wholesalers can improve their online visibility significantly, leading not just to increased web traffic but potentially to increased sales as well.

How to Optimize Your Wholesale Distributor Website for Local SEO

LOCAL SEO AUDIT
A thorough local SEO audit is the foundation of any effective strategy for wholesale distributors. Long-tail keywords are typically more specific phrases that customers are likely to use when they are closer to a point of purchase or when they are using voice search. This process not only serves as a health check but also highlights areas for improvement and opportunities for growth in local search visibility.
For instance, if you're a wholesale distributor with multiple branches across regions, properly structured location data can help display the correct branch information directly in SERPs when someone searches for suppliers near them.

OPTIMISING YOUR GOOGLE BUSINESS PROFILE
A well-managed Google Business Profile (GMB) enhances how you're perceived by both Google and potential customers using local searches.

Leveraging Local Partnerships
Local partnerships offer a significant opportunity for building backlinks as well as strengthening community ties.

Conduct Thorough Local Keyword Research
Understanding what potential customers are searching for locally is crucial for effective online directory management. In effect, this means that implementing these strategic steps ensures your wholesale distributor business doesn't just participate but stands out in localized digital environments, securing both visibility and viability in your targeted regions. For instance, real estate sites might feature neighborhood guides or lists of local services, which not only serve user needs but also improve local search rankings by keeping content fresh and regionally targeted.
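As a hedged illustration of the structured location data mentioned at the start of this passage (and of the review-derived rich snippets discussed later in this section), a branch page might embed schema.org LocalBusiness markup along these lines; every name, address, and rating value below is a placeholder:

```html
<!-- Placeholder schema.org markup for one hypothetical branch page. -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "WholesaleStore",
  "name": "Example Distribution Co. (Springfield branch)",
  "telephone": "+1-555-0100",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "100 Example Way",
    "addressLocality": "Springfield",
    "addressRegion": "IL",
    "postalCode": "62701"
  },
  "geo": { "@type": "GeoCoordinates", "latitude": 39.78, "longitude": -89.65 },
  "aggregateRating": { "@type": "AggregateRating", "ratingValue": "4.6", "reviewCount": "87" }
}
</script>
```

Markup like this is what lets a search engine show the correct branch address, and potentially star ratings, directly in its results.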
Continuous Improvement Based on Analytics
Lastly, consistent monitoring through analytics allows wholesale distributors to understand user behavior better, adapt strategies based on what is or isn't working, identify trends, and optimize accordingly, all aimed at maintaining high local SEO rankings.

Monitoring Your Progress
In short, regularly monitoring the impact of your backlinking strategy is essential. Proactive review management involves encouraging satisfied customers to share their positive experiences while addressing any negative feedback promptly and professionally, keeping your digital reputation intact. A strong GMB profile helps potential customers quickly find your distribution centers via Google Maps and Local Pack listings. Regular updates and posts on your GMB profile keep your business relevant, while accurate and comprehensive business information builds trust with potential customers.

Building Local Links and Citations
Establishing a network of local citations is crucial; these references boost local search rankings by verifying the geographical presence of the business across various platforms such as directories and social media profiles.
Crafting Targeted Local Content
Creating relevant, engaging content localized for specific audiences boosts both relevance and engagement. The audit also includes reviewing inbound links and Google Business Profiles while summarizing on-page performance and undertaking local keyword research. In effect, this means that understanding the balance between providing an exceptional user experience and optimizing technical SEO elements is crucial for any wholesale distributor looking to succeed in today's digital marketplace.

The Role of Customer Reviews in Local SEO for Distributors

Understanding the Impact of Customer Reviews
Customer reviews are integral to local SEO for distributors, serving as a direct line of communication between consumers and businesses.

Localized Content Strategy
For wholesale distributors operating in specific regions or multiple locations, localized content tailored to regional needs can significantly enhance visibility in local search results.

Reputation Management through Reviews
Managing online reputation through the proactive collection of customer feedback on Google reviews strengthens trustworthiness among potential clients, who often read these reviews before making contact.
Utilizing Google Posts Effectively
Google Posts allow businesses to share timely messages directly on their GMB profile, a feature especially useful for promoting special offers or news surfaced by voice search platforms.

Generating Impactful Local Content
Creating relevant, engaging content that resonates with a local audience can significantly enhance visibility and engagement rates.

Tracking Impact and Engagement
Measuring the impact of video content on your local SEO efforts is essential for ongoing optimization. For wholesalers operating in multiple locations, rather than localizing the main website for a single location, creating individual location pages optimized for their respective areas is more beneficial. Initially, local SEO involved simply including geographic locations within search queries. Utilizing advanced tools helps pinpoint precise keywords that align with user intent in specific locales.

Leveraging Social Proof through Reviews
Online reviews are vital, as they directly influence purchasing decisions and SEO rankings alike. In effect, this means that leveraging every tool available, from comprehensive audits and targeted content strategies to systematic citation building, is essential to driving successful wholesale distribution through strategic local SEO efforts.

Local Keyword Research
Understanding what potential customers are searching for locally forms the backbone of any on-page SEO strategy.
For businesses like real estate agencies, offering comprehensive guides about moving to the area can be particularly effective at drawing traffic from potential new residents. For wholesalers, this might involve creating detailed guides on services provided in specific areas, or articles highlighting community involvement and local events. The audit will delve into aspects such as your current local search rankings, citation overview, and reputation management strategy. A well-optimized GMB profile ensures that your business appears prominently in local search results, Google Maps, and the Local Pack.

Reputation Management Strategy
Proactively managing online reputation involves encouraging satisfied customers to share their positive experiences while addressing any negative feedback swiftly and constructively. Since posts are displayed prominently in both Maps and regular Google searches, they offer an excellent opportunity to increase visibility among users who rely on spoken queries. The audit not only evaluates on-page performance but also explores local keyword effectiveness and the competitive landscape, providing a comprehensive snapshot of where you stand and what actions to take next.

Enhancing Local Search Rankings
The quality and quantity of customer reviews contribute to the reputation signals used by search algorithms.
KEYWORD RESEARCH FOR LOCAL SEARCHES
Understanding what potential customers are searching for locally is critical to targeting them effectively. Rich snippets can include star ratings from reviews, product prices where applicable, or breadcrumbs that show where the business fits within larger categories or hierarchies. Encouraging satisfied customers to leave positive feedback goes a long way toward enhancing your online prominence among local competitors. Whether through regular blog posts about local events or content hubs focused on regional specifics like real estate guides or lists of local amenities, these efforts boost both the relevancy and reach of your website, which can help improve your overall search engine rankings.

Strategic Content Creation
Content tailored to the local audience significantly boosts a website's relevance in specific geographic areas.
Frequent updates ensure that your business remains relevant and engaging to potential customers browsing locally. Using advanced tools, we determine optimal keywords that align with both what users are searching for and the geographical nuances of your offerings.

Optimizing Your Google My Business Profile
A well-optimized Google My Business (GMB) profile is invaluable for being found via voice search, since many of these inquiries pull information directly from GMB listings.
CONTENT CREATION WITH A LOCAL FOCUS
Developing content that speaks directly to a local audience significantly boosts engagement and relevance in search results. Utilize advanced local keyword research tools to identify how best to optimize your site's local relevance.
Regular updates and adaptations to these keywords are necessary because consumer behaviors and search engine algorithms are constantly evolving.

Business Profile Optimization
Actively managing reviews on platforms like Google My Business not only boosts your reputation but also signals reliability to search engines, whose algorithms prioritize credible businesses.
This forms the foundation upon which localized content strategies are built, ensuring that every aspect of your online presence speaks directly to a local audience.

Local Keyword Research
Understanding what potential customers are searching for locally forms the backbone of a targeted SEO strategy.
Website Localization Techniques
Localizing your website effectively is crucial for wholesale distributors with multiple locations. Actively managing your reputation by encouraging positive reviews, addressing negative feedback constructively, and monitoring review patterns can significantly influence public perception and bolster trust among potential customers. Unlike traditional SEO, local optimization focuses on appearing in search results specific to the geographical area of your business. It enhances your presence in both Maps and Local Pack queries, which are prominent for users searching for nearby suppliers. Utilizing advanced tools, we pinpoint relevant local keywords that align closely with user intent and regional characteristics. Additionally, quality backlinks can drive direct traffic from other sites and enhance your brand's visibility. A robust profile of positive reviews instills confidence in prospective clients, potentially swaying their decisions toward your distribution business over competitors lacking similar validation. By optimizing their online presence through local SEO tactics, distributors can significantly increase their chances of appearing in coveted positions within localized search results and map packs. Regular reports help track progress towards set objectives while allowing room for swift pivots wherever necessary, ensuring that the strategy remains aligned with evolving market conditions and consumer behaviors.
Regular updates and adjustments to the keyword strategy are necessary because of constantly changing consumer behaviors and search engine algorithm updates. It is also beneficial to embed the videos on relevant pages throughout your website to further enhance page visibility and engagement.

Optimizing Your Google My Business Profile
A well-optimized Google My Business profile is essential for effective local SEO. The audit also involves a detailed analysis of your Google Business Profile and on-page performance. Distributors must implement effective strategies for soliciting feedback after purchase or service fulfillment. Our strategy includes creating content hubs centered on location pages, which might feature blogs on local events or articles about nearby landmarks and services.

GMB PROFILE SETUP AND OPTIMISATION
A well-optimized Google Business Profile enhances how your business appears in local searches, maps, and other Google services.

LOCAL KEYWORD RESEARCH
Identifying what potential customers are searching for locally forms the crux of any SEO campaign.
This research guides content creation on your website and enhances profiles like Google My Business (GMB), ensuring they resonate with local search trends and behaviors, which keep evolving. Scheduled posts about local events or news can increase engagement from nearby consumers who may need wholesale services frequently. These localized search results prioritize businesses with higher ratings and more frequent reviews, improving their accessibility to potential customers in the vicinity. A well-maintained GMB profile enhances how your business appears in Maps, local packs, and organic searches, which significantly increases both foot traffic and online interactions from potential customers. These measurements help refine strategies by revealing which types of video content resonate most with local audiences, ensuring future campaigns are even more successful. Content could range from blogs covering local news and events to comprehensive guides pertinent to the area served, such as resources available in a community for new residents.
Website Localization Techniques
For effective local SEO, it is essential to integrate localization elements across your website. Additionally, features like Google Posts allow businesses to communicate directly with their local customer base about offers, events, and updates.

Effective SEO Strategies for Wholesale Distributors

LOCAL SEO AUDIT
For wholesale distributors, conducting a thorough local SEO audit is the starting point for enhancing online visibility.
For wholesale distributors, this means diving deep into keyword research tailored to geographical nuances and integrating these terms strategically throughout their online content, from website text to Google My Business (GMB) profiles.

Monitoring Performance and Adjustments
Continuous monitoring allows adjustments based on detailed analytics showing how different aspects of an SEO strategy perform over time against competitors' actions or search algorithm updates by platforms like Google.
In effect, this means that regularly reviewing and refining these facets will keep you competitive within the dynamic landscape of local SEO, potentially improving both online visibility and sales outcomes as a wholesaler targeting regional markets. These posts provide fresh content that keeps potential customers informed and connected to your brand.
Utilizing advanced keyword research tools allows you to pinpoint specific local terms that are most relevant to your business and industry. A robust local SEO strategy ensures that a business remains visible in key digital areas without needing location-specific queries from users.
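As a toy sketch of the local keyword research these passages describe, seed service terms can be combined with service areas to produce long-tail candidates for a real keyword tool to vet; all terms and cities below are invented:

```python
# A toy sketch of local keyword expansion: combine seed service terms with
# service areas to produce long-tail candidates. The services, cities, and
# phrase patterns are hypothetical examples, not a prescribed list.
from itertools import product

services = ["wholesale building supplies", "bulk electrical distributor"]
cities = ["Springfield", "Riverton"]
patterns = ["{service} in {city}", "{service} near {city}", "best {service} {city}"]

candidates = [
    pattern.format(service=service, city=city)
    for service, city in product(services, cities)
    for pattern in patterns
]
print(len(candidates), "candidate phrases, e.g.:", candidates[0])
```

Candidates generated this way would still need search-volume and competition data from an actual keyword tool before being adopted.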
Search engine optimization (SEO) is the process of improving the quality and quantity of website traffic to a website or a web page from search engines.[1][2] SEO targets unpaid traffic (known as "natural" or "organic" results) rather than direct traffic or paid traffic. Unpaid traffic may originate from different kinds of searches, including image search, video search, academic search,[3] news search, and industry-specific vertical search engines.
As an Internet marketing strategy, SEO considers how search engines work, the computer-programmed algorithms that dictate search engine behavior, what people search for, the actual search terms or keywords typed into search engines, and which search engines are preferred by their targeted audience. SEO is performed because a website will receive more visitors from a search engine when it ranks higher on the search engine results page (SERP). These visitors can then potentially be converted into customers.[4]
Webmasters and content providers began optimizing websites for search engines in the mid-1990s, as the first search engines were cataloging the early Web. Initially, webmasters only needed to submit the address of a page, or URL, to the various engines, which would send a web crawler to crawl that page, extract links to other pages from it, and return information found on the page to be indexed.[5] The process involves a search engine spider/crawler downloading a page and storing it on the search engine's own server. A second program, known as an indexer, extracts information about the page, such as the words it contains, where they are located, and any weight for specific words, as well as all links the page contains. All of this information is then placed into a scheduler for crawling at a later date.
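The crawl-extract-index pipeline described above can be sketched in a few lines; this is a deliberately simplified illustration (real engines add politeness rules, distributed scheduling, and far richer scoring), and the seed URL is a placeholder:

```python
# A toy crawler-and-indexer, for illustration only.
from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import urlopen
from collections import defaultdict

class LinkAndTextExtractor(HTMLParser):
    """Collects outbound links and visible words from one page."""
    def __init__(self):
        super().__init__()
        self.links, self.words = [], []
    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)
    def handle_data(self, data):
        self.words.extend(data.lower().split())

def crawl(seed_url, max_pages=10):
    """Breadth-first crawl: fetch a page, index its words, queue its links."""
    frontier, seen = [seed_url], set()
    inverted_index = defaultdict(set)   # word -> set of URLs (the "indexer")
    while frontier and len(seen) < max_pages:
        url = frontier.pop(0)
        if url in seen:
            continue
        seen.add(url)
        try:
            html = urlopen(url, timeout=5).read().decode("utf-8", "ignore")
        except Exception:
            continue                    # skip unreachable pages
        parser = LinkAndTextExtractor()
        parser.feed(html)
        for word in parser.words:
            inverted_index[word].add(url)
        # extracted links are scheduled for a later crawl, as described above
        frontier.extend(urljoin(url, link) for link in parser.links)
    return inverted_index

index = crawl("https://example.com")   # placeholder seed URL
print(sorted(index.get("example", set())))
```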
Website owners recognized the value of a high ranking and visibility in search engine results,[6] creating an opportunity for both white hat and black hat SEO practitioners. According to industry analyst Danny Sullivan, the phrase "search engine optimization" probably came into use in 1997. Sullivan credits Bruce Clay as one of the first people to popularize the term.[7]
Early versions of search algorithms relied on webmaster-provided information such as the keyword meta tag or index files in engines like ALIWEB. Meta tags provide a guide to each page's content. Using metadata to index pages was found to be less than reliable, however, because the webmaster's choice of keywords in the meta tag could potentially be an inaccurate representation of the site's actual content. Flawed data in meta tags, such as those that were inaccurate or incomplete, created the potential for pages to be mischaracterized in irrelevant searches.[8][dubious – discuss] Web content providers also manipulated some attributes within the HTML source of a page in an attempt to rank well in search engines.[9] By 1997, search engine designers recognized that webmasters were making efforts to rank well in their search engine and that some webmasters were even manipulating their rankings in search results by stuffing pages with excessive or irrelevant keywords. Early search engines, such as Altavista and Infoseek, adjusted their algorithms to prevent webmasters from manipulating rankings.[10]
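For concreteness, the webmaster-supplied metadata these early algorithms relied on looked like the following; the values are invented, and modern engines largely ignore the keywords tag for ranking:

```html
<!-- Early engines took these tags as a guide to the page's content,
     which webmasters could fill with terms the page never delivered. -->
<meta name="keywords" content="wholesale, distributor, building supplies">
<meta name="description" content="Regional wholesale distributor of building supplies.">
```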
By heavily relying on factors such as keyword density, which were exclusively within a webmaster's control, early search engines suffered from abuse and ranking manipulation. To provide better results to their users, search engines had to adapt to ensure their results pages showed the most relevant search results, rather than unrelated pages stuffed with numerous keywords by unscrupulous webmasters. This meant moving away from heavy reliance on term density to a more holistic process for scoring semantic signals.[11] Since the success and popularity of a search engine are determined by its ability to produce the most relevant results to any given search, poor quality or irrelevant search results could lead users to find other search sources. Search engines responded by developing more complex ranking algorithms, taking into account additional factors that were more difficult for webmasters to manipulate.
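Keyword density itself is a trivial ratio, which is exactly why it was so easy to manipulate; a minimal sketch, assuming simple whitespace tokenization:

```python
# Keyword density as early engines might have scored it: a plain token ratio,
# entirely under the page author's control and trivially inflated.
def keyword_density(text: str, keyword: str) -> float:
    """Fraction of tokens in `text` equal to `keyword` (case-insensitive)."""
    tokens = text.lower().split()
    return tokens.count(keyword.lower()) / len(tokens) if tokens else 0.0

page = "cheap widgets cheap widgets buy cheap widgets now"
print(f"{keyword_density(page, 'cheap'):.0%}")  # 38%, inflated by stuffing
```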
Companies that employ overly aggressive techniques can get their client websites banned from the search results. In 2005, the Wall Street Journal reported on a company, Traffic Power, which allegedly used high-risk techniques and failed to disclose those risks to its clients.[12] Wired magazine reported that the same company sued blogger and SEO Aaron Wall for writing about the ban.[13] Google's Matt Cutts later confirmed that Google did in fact ban Traffic Power and some of its clients.[14]
Some search engines have also reached out to the SEO industry and are frequent sponsors and guests at SEO conferences, webchats, and seminars. Major search engines provide information and guidelines to help with website optimization.[15][16] Google has a Sitemaps program to help webmasters learn if Google is having any problems indexing their website and also provides data on Google traffic to the website.[17] Bing Webmaster Tools provides a way for webmasters to submit a sitemap and web feeds, allows users to determine the "crawl rate", and track the web pages index status.
In 2015, it was reported that Google was developing and promoting mobile search as a key feature within future products. In response, many brands began to take a different approach to their Internet marketing strategies.[18]
In 1998, two graduate students at Stanford University, Larry Page and Sergey Brin, developed "Backrub", a search engine that relied on a mathematical algorithm to rate the prominence of web pages. The number calculated by the algorithm, PageRank, is a function of the quantity and strength of inbound links.[19] PageRank estimates the likelihood that a given page will be reached by a web user who randomly surfs the web and follows links from one page to another. In effect, this means that some links are stronger than others, as a higher PageRank page is more likely to be reached by the random web surfer.
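A compact sketch of the PageRank computation may help; this toy power iteration on an invented three-page graph is illustrative only and omits refinements such as handling pages with no outbound links:

```python
# PageRank as described above: rank flows along links, damped by the
# probability d that the "random surfer" keeps clicking rather than jumping.
def pagerank(links, d=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}          # start uniform
    for _ in range(iterations):
        new_rank = {p: (1 - d) / n for p in pages}
        for page, outlinks in links.items():
            for target in outlinks:             # each page shares its rank
                new_rank[target] += d * rank[page] / len(outlinks)
        rank = new_rank
    return rank

graph = {"a": ["b", "c"], "b": ["c"], "c": ["a"]}
print(pagerank(graph))  # "c" scores highest: it carries the most inbound weight
```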
Page and Brin founded Google in 1998.[20] Google attracted a loyal following among the growing number of Internet users, who liked its simple design.[21] Off-page factors (such as PageRank and hyperlink analysis) were considered as well as on-page factors (such as keyword frequency, meta tags, headings, links and site structure) to enable Google to avoid the kind of manipulation seen in search engines that only considered on-page factors for their rankings. Although PageRank was more difficult to game, webmasters had already developed link-building tools and schemes to influence the Inktomi search engine, and these methods proved similarly applicable to gaming PageRank. Many sites focused on exchanging, buying, and selling links, often on a massive scale. Some of these schemes, or link farms, involved the creation of thousands of sites for the sole purpose of link spamming.[22]
By 2004, search engines had incorporated a wide range of undisclosed factors in their ranking algorithms to reduce the impact of link manipulation.[23] The leading search engines, Google, Bing, and Yahoo, do not disclose the algorithms they use to rank pages. Some SEO practitioners have studied different approaches to search engine optimization and have shared their personal opinions.[24] Patents related to search engines can provide information to better understand search engines.[25] In 2005, Google began personalizing search results for each user. Depending on their history of previous searches, Google crafted results for logged-in users.[26]
In 2007, Google announced a campaign against paid links that transfer PageRank.[27] On June 15, 2009, Google disclosed that they had taken measures to mitigate the effects of PageRank sculpting by use of the nofollow attribute on links. Matt Cutts, a well-known software engineer at Google, announced that Googlebot would no longer treat nofollowed links in the same way, in order to prevent SEO service providers from using nofollow for PageRank sculpting.[28] As a result of this change, the usage of nofollow led to the evaporation of PageRank. To avoid this, SEO engineers developed alternative techniques that replace nofollowed tags with obfuscated JavaScript and thus permit PageRank sculpting. Additionally, several solutions have been suggested that include the usage of iframes, Flash, and JavaScript.[29]
In December 2009, Google announced it would be using the web search history of all its users in order to populate search results.[30] On June 8, 2010, a new web indexing system called Google Caffeine was announced. Designed to allow users to find news results, forum posts, and other content much sooner after publishing than before, Google Caffeine was a change to the way Google updated its index in order to make things show up more quickly on Google than before. According to Carrie Grimes, the software engineer who announced Caffeine for Google, "Caffeine provides 50 percent fresher results for web searches than our last index..."[31] Google Instant, real-time search, was introduced in late 2010 in an attempt to make search results more timely and relevant. Historically, site administrators had spent months or even years optimizing a website to increase search rankings. With the growth in popularity of social media sites and blogs, the leading engines made changes to their algorithms to allow fresh content to rank quickly within the search results.[32]
In February 2011, Google announced the Panda update, which penalizes websites containing content duplicated from other websites and sources. Historically, websites had copied content from one another and benefited in search engine rankings by engaging in this practice. However, Google implemented a new system that punishes sites whose content is not unique.[33] The 2012 Google Penguin update attempted to penalize websites that used manipulative techniques to improve their rankings on the search engine.[34] Although Google Penguin has been presented as an algorithm aimed at fighting web spam, it really focuses on spammy links[35] by gauging the quality of the sites the links come from. The 2013 Google Hummingbird update featured an algorithm change designed to improve Google's natural language processing and semantic understanding of web pages. Hummingbird's language processing system falls under the newly recognized term of "conversational search", where the system pays more attention to each word in the query in order to better match pages to the meaning of the query rather than to a few words.[36] With regard to the changes made to search engine optimization, for content publishers and writers, Hummingbird is intended to resolve issues by getting rid of irrelevant content and spam, allowing Google to produce high-quality content and rely on 'trusted' authors.
In October 2019, Google announced they would start applying BERT models for English language search queries in the US. Bidirectional Encoder Representations from Transformers (BERT) was another attempt by Google to improve their natural language processing, but this time in order to better understand the search queries of their users.[37] In terms of search engine optimization, BERT was intended to connect users more easily to relevant content and increase the quality of traffic coming to websites that rank in the search engine results page.
The leading search engines, such as Google, Bing, and Yahoo!, use crawlers to find pages for their algorithmic search results. Pages that are linked from other search engine-indexed pages do not need to be submitted because they are found automatically. The Yahoo! Directory and DMOZ, two major directories which closed in 2014 and 2017 respectively, both required manual submission and human editorial review.[38] Google offers Google Search Console, for which an XML Sitemap feed can be created and submitted for free to ensure that all pages are found, especially pages that are not discoverable by automatically following links,[39] in addition to their URL submission console.[40] Yahoo! formerly operated a paid submission service that guaranteed crawling for a cost per click;[41] however, this practice was discontinued in 2009.
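A minimal XML Sitemap of the kind submitted through Google Search Console looks like the following; the URLs are placeholders, and the second entry stands in for a page not discoverable by following links:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Placeholder sitemap: lists URLs the site owner wants crawled. -->
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2023-01-15</lastmod>
  </url>
  <url>
    <loc>https://example.com/deep-page-not-linked-anywhere</loc>
  </url>
</urlset>
```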
Search engine crawlers may look at a number of different factors when crawling a site. Not every page is indexed by search engines. The distance of pages from the root directory of a site may also be a factor in whether or not pages get crawled.[42]
Mobile devices are used for the majority of Google searches.[43] In November 2016, Google announced a major change to the way they are crawling websites and started to make their index mobile-first, which means the mobile version of a given website becomes the starting point for what Google includes in their index.[44] In May 2019, Google updated the rendering engine of their crawler to be the latest version of Chromium (74 at the time of the announcement). Google indicated that they would regularly update the Chromium rendering engine to the latest version.[45] In December 2019, Google began updating the User-Agent string of their crawler to reflect the latest Chrome version used by their rendering service. The delay was to allow webmasters time to update their code that responded to particular bot User-Agent strings. Google ran evaluations and felt confident the impact would be minor.[46]
To avoid undesirable content in the search indexes, webmasters can instruct spiders not to crawl certain files or directories through the standard robots.txt file in the root directory of the domain. Additionally, a page can be explicitly excluded from a search engine's database by using a meta tag specific to robots (usually <meta name="robots" content="noindex">). When a search engine visits a site, the robots.txt located in the root directory is the first file crawled. The robots.txt file is then parsed, and it instructs the robot as to which pages are not to be crawled. As a search engine crawler may keep a cached copy of this file, it may on occasion crawl pages a webmaster does not wish crawled. Pages typically prevented from being crawled include login-specific pages such as shopping carts and user-specific content such as search results from internal searches. In March 2007, Google warned webmasters that they should prevent indexing of internal search results because those pages are considered search spam.[47] In 2020, Google sunsetted the standard (and open-sourced their code) and now treats it as a hint, not a directive. To adequately ensure that pages are not indexed, a page-level robots meta tag should be included.[48]
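For illustration, a robots.txt at the domain root might read as follows; the paths are hypothetical, and as noted above the file is a hint, so a page-level robots meta tag is still needed to reliably keep a page out of the index:

```
# Hypothetical robots.txt served at the root of the domain.
User-agent: *
Disallow: /cart/      # login-specific shopping-cart pages
Disallow: /search     # internal search results, per Google's 2007 warning
```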
A variety of methods can increase the prominence of a webpage within the search results. Cross linking between pages of the same website to provide more links to important pages may improve its visibility. Page design makes users trust a site and want to stay once they find it; when people bounce off a site, it counts against the site and affects its credibility.[49] Writing content that includes frequently searched keyword phrases, so as to be relevant to a wide variety of search queries, will tend to increase traffic. Updating content so as to keep search engines crawling back frequently can give additional weight to a site. Adding relevant keywords to a web page's metadata, including the title tag and meta description, will tend to improve the relevancy of a site's search listings, thus increasing traffic. URL canonicalization of web pages accessible via multiple URLs, using the canonical link element[50] or 301 redirects, can help make sure links to different versions of the URL all count towards the page's link popularity score. These incoming links, which point to the URL, count towards the page's link popularity score and impact the credibility of the website.[49]
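As a minimal example of the canonical link element just mentioned (the URLs are placeholders), every variant URL serving the same page declares the preferred version in its head:

```html
<!-- Served on https://example.com/widgets?sort=price and any other variant
     URL of the same page; link signals consolidate on the canonical URL. -->
<link rel="canonical" href="https://example.com/widgets">
```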
SEO techniques can be classified into two broad categories: techniques that search engine companies recommend as part of good design ("white hat"), and those techniques of which search engines do not approve ("black hat"). Search engines attempt to minimize the effect of the latter, among them spamdexing. Industry commentators have classified these methods and the practitioners who employ them as either white hat SEO or black hat SEO.[51] White hats tend to produce results that last a long time, whereas black hats anticipate that their sites may eventually be banned either temporarily or permanently once the search engines discover what they are doing.[52]
An SEO technique is considered a white hat if it conforms to the search engines' guidelines and involves no deception. As the search engine guidelines[15][16][53] are not written as a series of rules or commandments, this is an important distinction to note. White hat SEO is not just about following guidelines but is about ensuring that the content a search engine indexes and subsequently ranks is the same content a user will see. White hat advice is generally summed up as creating content for users, not for search engines, and then making that content easily accessible to the online "spider" algorithms, rather than attempting to trick the algorithm from its intended purpose. White hat SEO is in many ways similar to web development that promotes accessibility,[54] although the two are not identical.
Black hat SEO attempts to improve rankings in ways that are disapproved of by the search engines or involve deception. One black hat technique uses hidden text, either as text colored similar to the background, in an invisible div, or positioned off-screen. Another method gives a different page depending on whether the page is being requested by a human visitor or a search engine, a technique known as cloaking. Another category sometimes used is grey hat SEO. This is in between the black hat and white hat approaches, where the methods employed avoid the site being penalized but do not act in producing the best content for users. Grey hat SEO is entirely focused on improving search engine rankings.
Search engines may penalize sites they discover using black or grey hat methods, either by reducing their rankings or eliminating their listings from their databases altogether. Such penalties can be applied either automatically by the search engines' algorithms or by a manual site review. One example was the February 2006 Google removal of both BMW Germany and Ricoh Germany for the use of deceptive practices.[55] Both companies, however, quickly apologized, fixed the offending pages, and were restored to Google's search engine results page.[56]
SEO is not an appropriate strategy for every website, and other Internet marketing strategies can be more effective, such as paid advertising through pay-per-click (PPC) campaigns, depending on the site operator's goals. Search engine marketing (SEM) is the practice of designing, running, and optimizing search engine ad campaigns. Its difference from SEO is most simply depicted as the difference between paid and unpaid priority ranking in search results. SEM focuses on prominence more than relevance; website developers should regard SEM as highly important with respect to visibility, as most users navigate to the primary listings of their search.[57] A successful Internet marketing campaign may also depend upon building high-quality web pages to engage and persuade internet users, setting up analytics programs to enable site owners to measure results, and improving a site's conversion rate.[58][59] In November 2015, Google released a full 160-page version of its Search Quality Rating Guidelines to the public,[60] which revealed a shift in their focus towards "usefulness" and mobile local search. In recent years the mobile market has exploded, overtaking the use of desktops, as shown by StatCounter in October 2016, when they analyzed 2.5 million websites and found that 51.3% of the pages were loaded by a mobile device.[61] Google has been one of the companies utilizing the popularity of mobile usage by encouraging websites to use its Google Search Console and the Mobile-Friendly Test, which allow companies to measure their website against the search engine results and determine how user-friendly their websites are. The closer together related keywords appear, the more their ranking improves based on those key terms.[49]
SEO may generate an adequate return on investment. However, search engines are not paid for organic search traffic, their algorithms change, and there are no guarantees of continued referrals. Due to this lack of guarantee and uncertainty, a business that relies heavily on search engine traffic can suffer major losses if the search engines stop sending visitors.[62] Search engines can change their algorithms, impacting a website's search engine ranking, possibly resulting in a serious loss of traffic. According to Google's CEO, Eric Schmidt, in 2010, Google made over 500 algorithm changes – almost 1.5 per day.[63] It is considered a wise business practice for website operators to liberate themselves from dependence on search engine traffic.[64] In addition to accessibility in terms of web crawlers (addressed above), user web accessibility has become increasingly important for SEO.
Optimization techniques are highly tuned to the dominant search engines in the target market. The search engines' market shares vary from market to market, as does competition. In 2003, Danny Sullivan stated that Google represented about 75% of all searches.[65] In markets outside the United States, Google's share is often larger, and Google remains the dominant search engine worldwide as of 2007.[66] As of 2006, Google had an 85–90% market share in Germany.[67] While there were hundreds of SEO firms in the US at that time, there were only about five in Germany.[67] As of June 2008, the market share of Google in the UK was close to 90% according to Hitwise.[68] That market share is achieved in a number of countries.
As of 2009, there are only a few large markets where Google is not the leading search engine. In most cases, when Google is not leading in a given market, it is lagging behind a local player. The most notable example markets are China, Japan, South Korea, Russia, and the Czech Republic, where respectively Baidu, Yahoo! Japan, Naver, Yandex and Seznam are market leaders.
Successful search optimization for international markets may require professional translation of web pages, registration of a domain name with a top level domain in the target market, and web hosting that provides a local IP address. Otherwise, the fundamental elements of search optimization are essentially the same, regardless of language.[67]
On October 17, 2002, SearchKing filed suit in the United States District Court, Western District of Oklahoma, against the search engine Google. SearchKing's claim was that Google's tactics to prevent spamdexing constituted a tortious interference with contractual relations. On May 27, 2003, the court granted Google's motion to dismiss the complaint because SearchKing "failed to state a claim upon which relief may be granted."[69][70]
In March 2006, KinderStart filed a lawsuit against Google over search engine rankings. KinderStart's website was removed from Google's index prior to the lawsuit, and the amount of traffic to the site dropped by 70%. On March 16, 2007, the United States District Court for the Northern District of California (San Jose Division) dismissed KinderStart's complaint without leave to amend and partially granted Google's motion for Rule 11 sanctions against KinderStart's attorney, requiring him to pay part of Google's legal expenses.[71][72]
Search analytics is the use of search data to investigate particular interactions among Web searchers, the search engine, or the content during searching episodes.[1] The resulting analysis and aggregation of search engine statistics can be used in search engine marketing (SEM) and search engine optimization (SEO). In other words, search analytics helps website owners understand and improve their performance on search engines based on these outcomes, for example by identifying highly valuable site visitors[2] or understanding user intent.[3] Search analytics includes search volume trends and analysis, reverse searching (entering websites to see their keywords), keyword monitoring, search result and advertisement history, advertisement spending statistics, website comparisons, affiliate marketing statistics, multivariate ad testing, and more.[4]
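As a toy sketch of one activity listed above, keyword monitoring, consider aggregating a hypothetical in-house log of (date, query, clicked URL) rows; real services draw on far larger panels or scraped result pages:

```python
# Toy keyword monitoring over an invented query log; the rows, queries,
# and URLs are placeholders, not real data.
from collections import Counter

log = [
    ("2009-10-01", "wholesale widgets", "https://example.com/widgets"),
    ("2009-10-01", "widgets near me",   "https://example.com/widgets"),
    ("2009-10-02", "wholesale widgets", "https://rival.example.net/"),
]

volume = Counter(query for _, query, _ in log)   # input to volume-trend analysis
by_site = Counter(url for *_, url in log)        # which sites capture the clicks
print(volume.most_common(1))                     # [('wholesale widgets', 2)]
print(by_site)
```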
Search analytics data can be collected in several ways. Search engines provide access to their own data with services such as Google Analytics,[5] Google Trends, and Google Insights. Third-party services must collect their data from ISPs, phone-home software, or by scraping search engines. Getting traffic statistics from ISPs and phone-home software provides broader reporting of web traffic in addition to search analytics. Services that perform keyword monitoring only scrape a limited set of search results, depending on their clients' needs. Services providing reverse search, however, must scrape a large set of keywords from the search engines, usually in the millions, to find the keywords that everyone is using.[6]
Since search results, especially advertisements, differ depending on where the search is performed, data collection methods have to account for geographic location. Keyword monitors do this more easily, since they typically know what location their client is targeting. However, to get an exhaustive reverse search, several locations need to be scraped for the same keyword.
Search analytics accuracy depends on the service being used, the data collection method, and data freshness. Google releases its own data, but only in an aggregated way and often without assigning absolute values, such as the number of visitors, to its graphs.[7] ISP logs and phone-home methods are accurate for the population they sample, so sample size and demographics must be adequate to accurately represent the larger population. Scraping results can be highly accurate, especially when looking at the non-paid, organic search results. Paid results, from Google AdWords for example,[8] are often different for the same search depending on the time, geographic location, and history of searches from a particular computer. This means that scraping advertisement data can be hit or miss.
Looking at Google Insights to gauge the popularity of these services shows that, compared to searches for the term AdWords (Google's popular search ad system), use of search analytics services is still very low, around 1-25% as of Oct. 2009.[9] This could point to a large opportunity for the users and makers of search analytics, given that such services have existed since 2004, with several new services started since then.
Social media optimization (SMO) is the use of online platforms to generate income or publicity to increase the awareness of a brand, event, product or service. Types of social media involved include RSS feeds, blogging sites, social bookmarking sites, social news websites, video sharing websites such as YouTube, and social networking sites such as Facebook, Instagram, TikTok and X (Twitter). SMO is similar to search engine optimization (SEO) in that the goal is to drive web traffic and draw attention to a company or creator. SMO's focal point is on gaining organic links to social media content. In contrast, SEO's core is about reaching the top of the search engine hierarchy.[1] In general, social media optimization refers to optimizing a website and its content to encourage more users to use and share links to the website across social media and networking sites.[2]
SMO is used to strategically create online content ranging from well-written text to eye-catching digital photos or video clips that encourages and entices people to engage with a website. Users share this content, via its weblink, with social media contacts and friends. Common examples of social media engagement are "liking and commenting on posts, retweeting, embedding, sharing, and promoting content".[3] Social media optimization is also an effective way of implementing online reputation management (ORM), meaning that if someone posts bad reviews of a business, an SMO strategy can ensure that the negative feedback is not the first link to come up in a list of search engine results.[4]
In the 2010s, with social media sites overtaking TV as a source for news for young people, news organizations have become increasingly reliant on social media platforms for generating web traffic. Publishers such as The Economist employ large social media teams to optimize their online posts and maximize traffic,[5] while other major publishers now use advanced artificial intelligence (AI) technology to generate higher volumes of web traffic.[6]
Social media optimization is an increasingly important factor in search engine optimization, which is the process of designing a website so that it ranks as highly as possible on search engines. Search engines are increasingly utilizing the recommendations of users of social networks such as Reddit, Facebook, Tumblr, Twitter, YouTube, LinkedIn, Pinterest and Instagram to rank pages in the search engine result pages.[7] The implication is that when a webpage is shared or "liked" by a user on a social network, it counts as a "vote" for that webpage's quality. Thus, search engines can use such votes to rank websites properly in search engine results pages. Furthermore, since it is more difficult to tip the scales or influence the search engines in this way, search engines are putting more stock into social search.[7] This, coupled with increasingly personalized search based on interests and location, has significantly increased the importance of a social media presence in search engine optimization. Due to personalized search results, location-based social media presences on websites such as Yelp, Google Places, Foursquare, and Yahoo! Local have become increasingly important. While social media optimization is related to search engine marketing, it differs in several ways. Primarily, SMO focuses on driving web traffic from sources other than search engines, though improved search engine ranking is also a benefit of successful social media optimization. Further, SMO helps target particular geographic regions in order to reach potential customers. This helps in lead generation (finding new customers) and contributes to high conversion rates (i.e., converting previously uninterested individuals into people who are interested in a brand or organization).
Social media optimization is in many ways connected to the technique of viral marketing or "viral seeding", where word of mouth is created through the use of networking in social bookmarking, video and photo sharing websites. An effective SMO campaign can harness the power of viral marketing; for example, 80% of activity on Pinterest is generated through "repinning."[citation needed] Furthermore, by following social trends and utilizing alternative social networks, websites can retain existing followers while also attracting new ones. This allows businesses to build an online following and presence, all linking back to the company's website for increased traffic. For example, with an effective social bookmarking campaign, not only can website traffic be increased, but a site's rankings can also be increased. Engagement with blogs creates a similar result by sharing content through the use of RSS in the blogosphere. Social media optimization is considered an integral part of an online reputation management (ORM) or search engine reputation management (SERM) strategy for organizations or individuals who care about their online presence.[8] SMO is one of six key influencers that affect the Social Commerce Construct (SCC). Online activities such as consumers' evaluations of and advice on products and services constitute part of what creates the Social Commerce Construct (SCC).[citation needed]
Social media optimization is not limited to marketing and brand building. Increasingly, smart businesses are integrating social media participation as part of their knowledge management strategy (i.e., product/service development, recruiting, employee engagement and turnover, brand building, customer satisfaction and relations, business development and more). Additionally, social media optimization can be implemented to foster a community of the associated site, allowing for a healthy business-to-consumer (B2C) relationship.[9]
According to technologist Danny Sullivan, the term "social media optimization" was first used and described by marketer Rohit Bhargava[10][11] on his marketing blog in August 2006. In the same post, Bhargava established the five important rules of social media optimization. Bhargava believed that by following his rules, anyone could influence the levels of traffic and engagement on their site, increase popularity, and ensure that it ranks highly in search engine results. An additional 11 SMO rules have since been added to the list by other marketing contributors.
The 16 rules of SMO, according to one source, are as follows:[12]
Bhargava's initial five rules were more specifically tailored to SMO, while the list is now much broader and addresses everything that can be done across different social media platforms. According to author and CEO of TopRank Online Marketing Lee Odden, a social media strategy is also necessary to ensure optimization. This is a similar concept to Bhargava's list of rules for SMO.
The Social Media Strategy may consider:[13]
According to Lon Safko and David K. Brake in The Social Media Bible, it is also important to act like a publisher by maintaining an effective organizational strategy, to have an original concept and unique "edge" that differentiates one's approach from competitors, and to experiment with new ideas if things do not work the first time.[4] If a business is blog-based, an effective method of SMO is using widgets that allow users to share content to their personal social media platforms. This will ultimately reach a wider target audience and drive more traffic to the original post. Blog widgets and plug-ins for post-sharing are most commonly linked to Facebook, LinkedIn and x.com. They occasionally also link to social media platforms such as Tumblr and Pinterest. Many sharing widgets also include user counters which indicate how many times the content has been liked and shared across different social media pages. This can influence whether or not new users will engage with the post, and also gives businesses an idea of what kinds of posts are most successful at engaging audiences. By using relevant and trending keywords in titles and throughout blog posts, a business can also increase search engine optimization and the chances of their content being read and shared by a large audience.[13] The root of effective SMO is the content that is being posted, so professional content creation tools can be very beneficial. These can include editing programs such as Photoshop, GIMP, Final Cut Pro, and Dreamweaver. Many websites also offer customization options such as different layouts to personalize a page and create a point of difference.[4]
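As a minimal sketch of such post-sharing widgets, using the publicly documented share endpoints (the shared article URL and text are placeholders):

```html
<!-- Hypothetical share links for one blog post; only the URL parameters
     change per post, so these are easy to template into a blog layout. -->
<a href="https://www.facebook.com/sharer/sharer.php?u=https%3A%2F%2Fexample.com%2Fpost"
   target="_blank" rel="noopener">Share on Facebook</a>
<a href="https://twitter.com/intent/tweet?url=https%3A%2F%2Fexample.com%2Fpost&text=Worth%20a%20read"
   target="_blank" rel="noopener">Share on X</a>
```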
With social media sites overtaking TV as a source of news for young people, news organizations have become increasingly reliant on social media platforms for generating traffic. A report by the Reuters Institute for the Study of Journalism described how a 'second wave of disruption' had hit news organizations,[14] with publishers such as The Economist having to employ large social media teams to optimize their posts and maximize traffic.[5] Beyond the publishing industry, professional fields are utilizing SMO as well. Because doctors want to maximize exposure to their research findings, SMO has also found a place in the medical field.[15]
Today, 3.8 billion people globally use some form of social media. People frequently obtain health-related information from online social media platforms like Twitter and Facebook. Healthcare professionals and scientists can communicate with medical counterparts to discuss research and findings through social media platforms. These platforms provide researchers with data sets and surveillance that help detect patterns and behavior in preventing, informing, and studying global diseases such as COVID-19. Additionally, researchers utilize SMO to reach and recruit hard-to-reach patients, narrowing a study to the specific demographics that supply the data it needs.
Social media gaming is online gaming activity performed through social media sites with friends and online gaming activity that promotes social media interaction. Examples of the former include FarmVille, Clash of Clans, Clash Royale, FrontierVille, and Mafia Wars. In these games a player's social network is exploited to recruit additional players and allies. An example of the latter is Empire Avenue, a virtual stock exchange where players buy and sell shares of each other's social network worth. Nielsen Media Research estimates that, as of June 2010, social networking and playing online games account for about one-third of all online activity by Americans.[16]
Facebook has in recent years become a popular channel for advertising, alongside traditional forms such as television, radio, and print. With over 1 billion active users, 50% of whom log into their accounts every day,[17] it is an important communication platform that businesses can utilize and optimize to promote their brand and drive traffic to their websites. There are three commonly used strategies to increase advertising reach on Facebook: improving the effectiveness of posts, increasing the size of the page's network, and buying more reach.
Improving effectiveness and increasing network size are organic approaches, while buying more reach is a paid approach which does not require any further action.[18] Most businesses will attempt an "organic" approach to gaining a significant following before considering a paid approach. Because Facebook requires a login, it is important that posts are public to ensure they will reach the widest possible audience. Posts that have been heavily shared and interacted with by users are displayed as 'highlighted posts' at the top of newsfeeds. In order to achieve this status, the posts need to be engaging, interesting, or useful. This can be achieved by being spontaneous, asking questions, addressing current events and issues, and using trending hashtags and keywords. The more engagement a post receives, the further it will spread and the more likely it is to feature first in search results.
Another organic approach to Facebook optimization is cross-linking different social platforms. By posting links to websites or social media sites in the profile 'about' section, it is possible to direct traffic and ultimately increase search engine optimization. Another option is to share links to relevant videos and blog posts.[13] Facebook Connect is a functionality that launched in 2008 to allow Facebook users to sign up to different websites, enter competitions, and access exclusive promotions by logging in with their existing Facebook account details. This is beneficial to users as they don't have to create a new login every time they want to sign up to a website, and beneficial to businesses as Facebook users become more likely to share their content. Often the two are interlinked: in order to access parts of a website, a user has to like or share certain things on their personal profile or invite a number of friends to like a page. This can lead to greater traffic flow to a website as it reaches a wider audience. Businesses have more opportunities to reach their target markets if they choose a paid approach to SMO. When Facebook users create an account, they are urged to fill out their personal details such as gender, age, location, education, current and previous employers, religious and political views, interests, and personal preferences such as movie and music tastes. Facebook then takes this information and allows advertisers to use it to determine how best to market themselves to users that they know will be interested in their product. This practice is known as micro-targeting. If a user clicks on a link to like a page, it will show up on their profile and newsfeed. This then feeds back into organic social media optimization, as friends of the user will see this and be encouraged to click on the page themselves. Although advertisers are buying mass reach, they are attracting a customer base with a genuine interest in their product. Once a customer base has been established through a paid approach, businesses will often run promotions and competitions to attract more organic followers.[12]
The number of businesses that use Facebook to advertise also holds significant relevance. In 2017, three million businesses advertised on Facebook,[19] making it the world's largest platform for social media advertising. The amount of money leading businesses spend on Facebook advertising alone is also significant: Procter & Gamble spends $60 million every year on Facebook advertising.[20] Other advertisers on Facebook include Microsoft, with a yearly spend of £35 million, and Amazon, Nestlé, and American Express, each with yearly expenditures above £25 million.
Furthermore, the number of small businesses advertising on Facebook is of relevance. This number has grown rapidly over recent years and demonstrates how important social media advertising has become. Currently, 70% of the UK's small businesses use Facebook advertising,[21] a substantial number of advertisers, and almost half of the world's small businesses use some form of social media marketing product. This demonstrates the impact that social media has had on the current digital marketing era.
ER (engagement rate) represents the activity of users on a given profile on Facebook, Instagram, TikTok, or any other social media platform. A common way to calculate it is:

ER = (interactions / followers) × 100%

Here, followers is the total number of followers (friends, subscribers, etc.) and interactions is the number of interactions, such as likes, comments, personal messages, and shares, averaged over a period of time. The period should normally be short enough that the variance in the follower count during it is negligible.
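A minimal sketch of this calculation in Python, assuming per-post interaction counts for the period have already been collected (the function and variable names are illustrative, not from any particular analytics tool):

```python
def engagement_rate(interactions_per_post, followers):
    """Engagement rate as described above: interactions averaged over
    a period, divided by total followers, expressed as a percentage.

    interactions_per_post -- per-post interaction counts (likes,
    comments, messages, shares) within the period.
    followers -- follower count, assumed roughly constant over the
    period so a single value is representative.
    """
    if not interactions_per_post or followers <= 0:
        raise ValueError("need at least one post and a positive follower count")
    avg_interactions = sum(interactions_per_post) / len(interactions_per_post)
    return avg_interactions / followers * 100

# Example: 5 posts in a week on a profile with 2,000 followers.
print(engagement_rate([120, 95, 180, 60, 145], 2000))  # -> 6.0 (percent)
```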
Search engine optimization (SEO) is the process of improving the quality and quantity of website traffic to a website or a web page from search engines.[1][2] SEO targets unpaid traffic (known as "natural" or "organic" results) rather than direct traffic or paid traffic. Unpaid traffic may originate from different kinds of searches, including image search, video search, academic search,[3] news search, and industry-specific vertical search engines.
As an Internet marketing strategy, SEO considers how search engines work, the computer-programmed algorithms that dictate search engine behavior, what people search for, the actual search terms or keywords typed into search engines, and which search engines are preferred by their targeted audience. SEO is performed because a website will receive more visitors from a search engine when websites rank higher on the search engine results page (SERP). These visitors can then potentially be converted into customers.[4]
Webmasters and content providers began optimizing websites for search engines in the mid-1990s, as the first search engines were cataloging the early Web. Initially, all webmasters only needed to submit the address of a page, or URL, to the various engines, which would send a web crawler to crawl that page, extract links to other pages from it, and return information found on the page to be indexed.[5] The process involves a search engine spider (crawler) downloading a page and storing it on the search engine's own server. A second program, known as an indexer, extracts information about the page, such as the words it contains, where they are located, and any weight for specific words, as well as all links the page contains. All of this information is then placed into a scheduler for crawling at a later date.
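That crawl-store-extract loop can be sketched in a few lines of Python using only the standard library. This is a toy illustration, not how any production engine works: the in-memory dict and list stand in for the engine's storage and scheduler, and the starting URL is a placeholder.

```python
from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import urlopen

class LinkExtractor(HTMLParser):
    """Indexer step: pull outgoing links out of a fetched page."""
    def __init__(self):
        super().__init__()
        self.links = []
    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl(url, store, schedule):
    """Crawler step: fetch a page, store it, and queue its links."""
    html = urlopen(url).read().decode("utf-8", errors="replace")
    store[url] = html                        # the engine's own copy of the page
    parser = LinkExtractor()
    parser.feed(html)
    for link in parser.links:
        schedule.append(urljoin(url, link))  # scheduled for crawling later

store, schedule = {}, ["https://example.com/"]  # placeholder seed URL
crawl(schedule.pop(0), store, schedule)
```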
Website owners recognized the value of a high ranking and visibility in search engine results,[6] creating an opportunity for both white hat and black hat SEO practitioners. According to industry analyst Danny Sullivan, the phrase "search engine optimization" probably came into use in 1997. Sullivan credits Bruce Clay as one of the first people to popularize the term.[7]
Early versions of search algorithms relied on webmaster-provided information such as the keyword meta tag or index files in engines like ALIWEB. Meta tags provide a guide to each page's content. Using metadata to index pages was found to be less than reliable, however, because the webmaster's choice of keywords in the meta tag could potentially be an inaccurate representation of the site's actual content. Flawed data in meta tags, such as those that were inaccurate or incomplete, created the potential for pages to be mischaracterized in irrelevant searches.[8] Web content providers also manipulated some attributes within the HTML source of a page in an attempt to rank well in search engines.[9] By 1997, search engine designers recognized that webmasters were making efforts to rank well in their search engine and that some webmasters were even manipulating their rankings in search results by stuffing pages with excessive or irrelevant keywords. Early search engines, such as AltaVista and Infoseek, adjusted their algorithms to prevent webmasters from manipulating rankings.[10]
By heavily relying on factors such as keyword density, which were exclusively within a webmaster's control, early search engines suffered from abuse and ranking manipulation. To provide better results to their users, search engines had to adapt to ensure their results pages showed the most relevant search results, rather than unrelated pages stuffed with numerous keywords by unscrupulous webmasters. This meant moving away from heavy reliance on term density to a more holistic process for scoring semantic signals.[11] Since the success and popularity of a search engine are determined by its ability to produce the most relevant results to any given search, poor quality or irrelevant search results could lead users to find other search sources. Search engines responded by developing more complex ranking algorithms, taking into account additional factors that were more difficult for webmasters to manipulate.
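For concreteness, keyword density is simply the share of a page's words accounted for by a given term, which is exactly what made it so easy to manipulate. A rough illustration (the sample text and threshold-free function are purely illustrative):

```python
import re

def keyword_density(text, keyword):
    """Fraction of words in `text` that match `keyword`, case-insensitively.
    Early engines leaned on signals like this, which is why keyword
    stuffing could inflate rankings."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    return words.count(keyword.lower()) / len(words)

sample = "Cheap widgets! Buy cheap widgets here, the best cheap widgets."
print(f"{keyword_density(sample, 'widgets'):.0%}")  # -> 30%
```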
Companies that employ overly aggressive techniques can get their client websites banned from the search results. In 2005, the Wall Street Journal reported on a company, Traffic Power, which allegedly used high-risk techniques and failed to disclose those risks to its clients.[12] Wired magazine reported that the same company sued blogger and SEO Aaron Wall for writing about the ban.[13] Google's Matt Cutts later confirmed that Google did in fact ban Traffic Power and some of its clients.[14]
Some search engines have also reached out to the SEO industry and are frequent sponsors and guests at SEO conferences, webchats, and seminars. Major search engines provide information and guidelines to help with website optimization.[15][16] Google has a Sitemaps program to help webmasters learn if Google is having any problems indexing their website and also provides data on Google traffic to the website.[17] Bing Webmaster Tools provides a way for webmasters to submit a sitemap and web feeds, allows users to determine the "crawl rate", and track the web pages index status.
In 2015, it was reported that Google was developing and promoting mobile search as a key feature within future products. In response, many brands began to take a different approach to their Internet marketing strategies.[18]
In 1998, two graduate students at Stanford University, Larry Page and Sergey Brin, developed "Backrub", a search engine that relied on a mathematical algorithm to rate the prominence of web pages. The number calculated by the algorithm, PageRank, is a function of the quantity and strength of inbound links.[19] PageRank estimates the likelihood that a given page will be reached by a web user who randomly surfs the web and follows links from one page to another. In effect, this means that some links are stronger than others, as a higher PageRank page is more likely to be reached by the random web surfer.
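The random-surfer idea translates directly into the textbook power-iteration computation of PageRank sketched below. This is an illustrative sketch, not Google's production algorithm; the 0.85 damping factor is the commonly cited textbook value.

```python
def pagerank(links, damping=0.85, iterations=50):
    """Textbook power iteration over a link graph.

    links -- dict mapping each page to the list of pages it links to.
    A page's score is the probability that a random surfer, who follows
    links and occasionally jumps to a random page, is found on it.
    """
    pages = list(links)
    n = len(pages)
    rank = {p: 1 / n for p in pages}
    for _ in range(iterations):
        new = {p: (1 - damping) / n for p in pages}  # random-jump share
        for page, outlinks in links.items():
            if not outlinks:                  # dangling page: spread evenly
                for p in pages:
                    new[p] += damping * rank[page] / n
            else:                             # pass rank along each outlink
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new[target] += share
        rank = new
    return rank

# Tiny illustrative graph: every page links to "a", so "a" scores highest.
print(pagerank({"a": ["b"], "b": ["a"], "c": ["a"]}))
```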
Page and Brin founded Google in 1998.[20] Google attracted a loyal following among the growing number of Internet users, who liked its simple design.[21] Off-page factors (such as PageRank and hyperlink analysis) were considered as well as on-page factors (such as keyword frequency, meta tags, headings, links and site structure) to enable Google to avoid the kind of manipulation seen in search engines that only considered on-page factors for their rankings. Although PageRank was more difficult to game, webmasters had already developed link-building tools and schemes to influence the Inktomi search engine, and these methods proved similarly applicable to gaming PageRank. Many sites focused on exchanging, buying, and selling links, often on a massive scale. Some of these schemes, or link farms, involved the creation of thousands of sites for the sole purpose of link spamming.[22]
By 2004, search engines had incorporated a wide range of undisclosed factors in their ranking algorithms to reduce the impact of link manipulation.[23] The leading search engines, Google, Bing, and Yahoo, do not disclose the algorithms they use to rank pages. Some SEO practitioners have studied different approaches to search engine optimization and have shared their personal opinions.[24] Patents related to search engines can provide information to better understand search engines.[25] In 2005, Google began personalizing search results for each user: depending on their history of previous searches, Google crafted results for logged-in users.[26]
In 2007, Google announced a campaign against paid links that transfer PageRank.[27] On June 15, 2009, Google disclosed that it had taken measures to mitigate the effects of PageRank sculpting by use of the nofollow attribute on links. Matt Cutts, a well-known software engineer at Google, announced that Googlebot would no longer treat nofollowed links in the same way, in order to prevent SEO service providers from using nofollow for PageRank sculpting.[28] As a result of this change, the use of nofollow led to evaporation of PageRank. To avoid this, SEO engineers developed alternative techniques that replace nofollowed tags with obfuscated JavaScript and thus permit PageRank sculpting. Additionally, several solutions have been suggested that include the use of iframes, Flash, and JavaScript.[29]
In December 2009, Google announced it would be using the web search history of all its users in order to populate search results.[30] On June 8, 2010, a new web indexing system called Google Caffeine was announced. Designed to allow users to find news results, forum posts, and other content much sooner after publishing, Google Caffeine was a change to the way Google updated its index, intended to make content show up more quickly in results. According to Carrie Grimes, the software engineer who announced Caffeine for Google, "Caffeine provides 50 percent fresher results for web searches than our last index..."[31] Google Instant, real-time search, was introduced in late 2010 in an attempt to make search results more timely and relevant. Historically, site administrators have spent months or even years optimizing a website to increase search rankings. With the growth in popularity of social media sites and blogs, the leading engines made changes to their algorithms to allow fresh content to rank quickly within the search results.[32]
In February 2011, Google announced the Panda update, which penalizes websites containing content duplicated from other websites and sources. Historically, websites had copied content from one another and benefited in search engine rankings by engaging in this practice. However, Google implemented a new system that punishes sites whose content is not unique.[33] The 2012 Google Penguin update attempted to penalize websites that used manipulative techniques to improve their rankings on the search engine.[34] Although Google Penguin has been presented as an algorithm aimed at fighting web spam, it really focuses on spammy links[35] by gauging the quality of the sites the links are coming from. The 2013 Google Hummingbird update featured an algorithm change designed to improve Google's natural language processing and semantic understanding of web pages. Hummingbird's language processing system falls under the newly recognized term of "conversational search", where the system pays more attention to each word in the query in order to better match pages to the meaning of the query rather than to a few words.[36] With regard to search engine optimization, for content publishers and writers, Hummingbird is intended to resolve issues by getting rid of irrelevant content and spam, allowing Google to surface high-quality content from 'trusted' authors.
In October 2019, Google announced it would start applying BERT models for English-language search queries in the US. Bidirectional Encoder Representations from Transformers (BERT) was another attempt by Google to improve its natural language processing, this time in order to better understand the search queries of its users.[37] In terms of search engine optimization, BERT was intended to connect users more easily to relevant content and increase the quality of traffic coming to websites that rank in the search engine results page.
The leading search engines, such as Google, Bing, and Yahoo!, use crawlers to find pages for their algorithmic search results. Pages that are linked from other search engine-indexed pages do not need to be submitted because they are found automatically. The Yahoo! Directory and DMOZ, two major directories which closed in 2014 and 2017 respectively, both required manual submission and human editorial review.[38] Google offers Google Search Console, for which an XML Sitemap feed can be created and submitted for free to ensure that all pages are found, especially pages that are not discoverable by automatically following links[39] in addition to their URL submission console.[40] Yahoo! formerly operated a paid submission service that guaranteed to crawl for a cost per click;[41] however, this practice was discontinued in 2009.
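An XML sitemap of the kind submitted through such consoles can be generated with a few lines of Python. This is a minimal sketch assuming a hypothetical two-page site; real sitemaps often also carry last-modified dates and other optional fields.

```python
import xml.etree.ElementTree as ET

# Namespace defined by the sitemaps.org protocol.
NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

urlset = ET.Element("urlset", xmlns=NS)
for page in ("https://example.com/", "https://example.com/products"):  # placeholder URLs
    url = ET.SubElement(urlset, "url")
    ET.SubElement(url, "loc").text = page

# Writes a sitemap.xml ready for submission to a webmaster console.
ET.ElementTree(urlset).write("sitemap.xml", encoding="utf-8", xml_declaration=True)
```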
Search engine crawlers may look at a number of different factors when crawling a site. Not every page is indexed by search engines. The distance of pages from the root directory of a site may also be a factor in whether or not pages get crawled.[42]
Mobile devices are used for the majority of Google searches.[43] In November 2016, Google announced a major change to the way they are crawling websites and started to make their index mobile-first, which means the mobile version of a given website becomes the starting point for what Google includes in their index.[44] In May 2019, Google updated the rendering engine of their crawler to be the latest version of Chromium (74 at the time of the announcement). Google indicated that they would regularly update the Chromium rendering engine to the latest version.[45] In December 2019, Google began updating the User-Agent string of their crawler to reflect the latest Chrome version used by their rendering service. The delay was to allow webmasters time to update their code that responded to particular bot User-Agent strings. Google ran evaluations and felt confident the impact would be minor.[46]
To avoid undesirable content in the search indexes, webmasters can instruct spiders not to crawl certain files or directories through the standard robots.txt file in the root directory of the domain. Additionally, a page can be explicitly excluded from a search engine's database by using a meta tag specific to robots (usually <meta name="robots" content="noindex">). When a search engine visits a site, the robots.txt located in the root directory is the first file crawled. The robots.txt file is then parsed and will instruct the robot as to which pages are not to be crawled. As a search engine crawler may keep a cached copy of this file, it may on occasion crawl pages a webmaster does not wish to be crawled. Pages typically prevented from being crawled include login-specific pages such as shopping carts and user-specific content such as search results from internal searches. In March 2007, Google warned webmasters that they should prevent indexing of internal search results because those pages are considered search spam.[47] In 2020, Google sunsetted the standard (and open-sourced its code) and now treats it as a hint rather than a directive. To adequately ensure that pages are not indexed, a page-level robots meta tag should be included.[48]
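Python's standard library includes a robots.txt parser, which makes the permission check a well-behaved crawler performs easy to demonstrate. The site, file contents, and crawler name below are hypothetical.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical site: assume https://example.com/robots.txt contains
#   User-agent: *
#   Disallow: /cart/
rp = RobotFileParser()
rp.set_url("https://example.com/robots.txt")
rp.read()  # fetch and parse the file, as a crawler would on arrival

# Given the assumed file above, the cart is off-limits but products are not.
print(rp.can_fetch("MyCrawler", "https://example.com/cart/checkout"))  # False
print(rp.can_fetch("MyCrawler", "https://example.com/products"))       # True
```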
A variety of methods can increase the prominence of a webpage within the search results. Cross linking between pages of the same website to provide more links to important pages may improve its visibility. Page design makes users trust a site and want to stay once they find it; when people bounce off a site, it counts against the site and affects its credibility.[49] Writing content that includes frequently searched keyword phrases, so as to be relevant to a wide variety of search queries, will tend to increase traffic. Updating content so as to keep search engines crawling back frequently can give additional weight to a site. Adding relevant keywords to a web page's metadata, including the title tag and meta description, will tend to improve the relevancy of a site's search listings, thus increasing traffic. URL canonicalization of web pages accessible via multiple URLs, using the canonical link element[50] or 301 redirects, can help make sure links to different versions of the URL all count towards the page's link popularity score. These incoming links, which point to the URL, count towards the page's link popularity score and impact the credibility of the website.[49]
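As an illustration of the canonicalization idea, a small normalizer can collapse common variants of the same address into one form; in practice a site pairs logic like this with the canonical link element or 301 redirects. The rules below are illustrative, not exhaustive.

```python
from urllib.parse import urlsplit, urlunsplit

def canonicalize(url):
    """Collapse common URL variants to one canonical form: lowercase
    scheme and host, drop default ports and fragments, and strip
    trailing slashes on non-root paths."""
    parts = urlsplit(url)
    host = parts.hostname or ""          # .hostname is already lowercased
    if parts.port and parts.port not in (80, 443):
        host = f"{host}:{parts.port}"    # keep only non-default ports
    path = parts.path.rstrip("/") or "/"
    return urlunsplit((parts.scheme.lower(), host, path, parts.query, ""))

for variant in ("https://Example.com:443/page/",
                "https://example.com/page#section",
                "https://example.com/page"):
    print(canonicalize(variant))  # all three print https://example.com/page
```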
SEO techniques can be classified into two broad categories: techniques that search engine companies recommend as part of good design ("white hat"), and those techniques of which search engines do not approve ("black hat"). Search engines attempt to minimize the effect of the latter, among them spamdexing. Industry commentators have classified these methods and the practitioners who employ them as either white hat SEO or black hat SEO.[51] White hats tend to produce results that last a long time, whereas black hats anticipate that their sites may eventually be banned either temporarily or permanently once the search engines discover what they are doing.[52]
An SEO technique is considered a white hat if it conforms to the search engines' guidelines and involves no deception. As the search engine guidelines[15][16][53] are not written as a series of rules or commandments, this is an important distinction to note. White hat SEO is not just about following guidelines but is about ensuring that the content a search engine indexes and subsequently ranks is the same content a user will see. White hat advice is generally summed up as creating content for users, not for search engines, and then making that content easily accessible to the online "spider" algorithms, rather than attempting to trick the algorithm from its intended purpose. White hat SEO is in many ways similar to web development that promotes accessibility,[54] although the two are not identical.
Black hat SEO attempts to improve rankings in ways that are disapproved of by the search engines or involve deception. One black hat technique uses hidden text, either as text colored similar to the background, in an invisible div, or positioned off-screen. Another method serves a different page depending on whether the page is being requested by a human visitor or a search engine, a technique known as cloaking. Another category sometimes used is grey hat SEO. This is in between the black hat and white hat approaches, where the methods employed avoid the site being penalized but do not go as far as producing the best content for users. Grey hat SEO is entirely focused on improving search engine rankings.
Search engines may penalize sites they discover using black or grey hat methods, either by reducing their rankings or eliminating their listings from their databases altogether. Such penalties can be applied either automatically by the search engines' algorithms or by a manual site review. One example was the February 2006 Google removal of both BMW Germany and Ricoh Germany for the use of deceptive practices.[55] Both companies, however, quickly apologized, fixed the offending pages, and were restored to Google's search engine results page.[56]
SEO is not an appropriate strategy for every website, and other Internet marketing strategies can be more effective, such as paid advertising through pay-per-click (PPC) campaigns, depending on the site operator's goals. Search engine marketing (SEM) is the practice of designing, running, and optimizing search engine ad campaigns. Its difference from SEO is most simply depicted as the difference between paid and unpaid priority ranking in search results. SEM focuses on prominence more so than relevance; website developers should regard SEM with the utmost importance with consideration to visibility, as most users navigate to the primary listings of their search.[57] A successful Internet marketing campaign may also depend upon building high-quality web pages to engage and persuade internet users, setting up analytics programs to enable site owners to measure results, and improving a site's conversion rate.[58][59] In November 2015, Google released a full 160-page version of its Search Quality Rating Guidelines to the public,[60] which revealed a shift in its focus towards "usefulness" and mobile local search. In recent years the mobile market has exploded, overtaking the use of desktops, as shown by StatCounter in October 2016, which analyzed 2.5 million websites and found that 51.3% of the pages were loaded by a mobile device.[61] Google has been one of the companies taking advantage of the growth in mobile usage by encouraging websites to use its Google Search Console and Mobile-Friendly Test, which let companies measure their website against the search engine results and determine how user-friendly their websites are. The closer together related key terms appear, the more a page's ranking will improve for those terms.[49]
SEO may generate an adequate return on investment. However, search engines are not paid for organic search traffic, their algorithms change, and there are no guarantees of continued referrals. Due to this lack of guarantee and uncertainty, a business that relies heavily on search engine traffic can suffer major losses if the search engines stop sending visitors.[62] Search engines can change their algorithms, impacting a website's search engine ranking, possibly resulting in a serious loss of traffic. According to Google's CEO, Eric Schmidt, in 2010, Google made over 500 algorithm changes – almost 1.5 per day.[63] It is considered a wise business practice for website operators to liberate themselves from dependence on search engine traffic.[64] In addition to accessibility in terms of web crawlers (addressed above), user web accessibility has become increasingly important for SEO.
Optimization techniques are highly tuned to the dominant search engines in the target market. The search engines' market shares vary from market to market, as does competition. In 2003, Danny Sullivan stated that Google represented about 75% of all searches.[65] In markets outside the United States, Google's share is often larger, and Google remains the dominant search engine worldwide as of 2007.[66] As of 2006, Google had an 85–90% market share in Germany.[67] While there were hundreds of SEO firms in the US at that time, there were only about five in Germany.[67] As of June 2008, the market share of Google in the UK was close to 90% according to Hitwise.[68] That market share is achieved in a number of countries.
As of 2009, there are only a few large markets where Google is not the leading search engine. In most cases, when Google is not leading in a given market, it is lagging behind a local player. The most notable example markets are China, Japan, South Korea, Russia, and the Czech Republic, where respectively Baidu, Yahoo! Japan, Naver, Yandex and Seznam are market leaders.
Successful search optimization for international markets may require professional translation of web pages, registration of a domain name with a top level domain in the target market, and web hosting that provides a local IP address. Otherwise, the fundamental elements of search optimization are essentially the same, regardless of language.[67]
On October 17, 2002, SearchKing filed suit in the United States District Court, Western District of Oklahoma, against the search engine Google. SearchKing's claim was that Google's tactics to prevent spamdexing constituted a tortious interference with contractual relations. On May 27, 2003, the court granted Google's motion to dismiss the complaint because SearchKing "failed to state a claim upon which relief may be granted."[69][70]
In March 2006, KinderStart filed a lawsuit against Google over search engine rankings. KinderStart's website was removed from Google's index prior to the lawsuit, and the amount of traffic to the site dropped by 70%. On March 16, 2007, the United States District Court for the Northern District of California (San Jose Division) dismissed KinderStart's complaint without leave to amend and partially granted Google's motion for Rule 11 sanctions against KinderStart's attorney, requiring him to pay part of Google's legal expenses.[71][72]
Search analytics is the use of search data to investigate particular interactions among Web searchers, the search engine, or the content during searching episodes.[1] The resulting analysis and aggregation of search engine statistics can be used in search engine marketing (SEM) and search engine optimization (SEO). In other words, search analytics helps website owners understand and improve their performance on search engines, for example by identifying highly valuable site visitors[2] or understanding user intent.[3] Search analytics includes search volume trends and analysis, reverse searching (entering websites to see their keywords), keyword monitoring, search result and advertisement history, advertisement spending statistics, website comparisons, affiliate marketing statistics, multivariate ad testing, etc.[4]
Search analytics data can be collected in several ways. Search engines provide access to their own data with services such as Google Analytics,[5] Google Trends, and Google Insights. Third-party services must collect their data from ISPs, from phone-home software, or by scraping search engines. Getting traffic statistics from ISPs and phone-home software provides broader reporting of web traffic in addition to search analytics. Services that perform keyword monitoring only scrape a limited set of search results, depending on their clients' needs. Services providing reverse search, however, must scrape a large set of keywords from the search engines, usually in the millions, to find the keywords that everyone is using.[6]
Since search results, especially advertisements, differ depending on where you are searching from, data collection methods have to account for geographic location. Keyword monitors do this more easily since they typically know what location their client is targeting. However, to get an exhaustive reverse search, several locations need to be scraped for the same keyword.
Search analytics accuracy depends on the service being used, the data collection method, and data freshness. Google releases its own data, but only in an aggregated way and often without assigning absolute values, such as the number of visitors, to its graphs.[7] ISP logs and phone-home methods are accurate for the population they sample, so sample size and demographics must be adequate to accurately represent the larger population. Scraping results can be highly accurate, especially when looking at the non-paid, organic search results. Paid results, from Google AdWords for example,[8] are often different for the same search depending on the time, geographic location, and history of searches from a particular computer. This means that scraping advertisers can be hit or miss.
Taking a look at Google Insights to gauge the popularity of these services shows that, compared to searches for the term AdWords (Google's popular search ad system), use of search analytics services was still very low, around 1–25% as of October 2009.[9] This could point to a large opportunity for the users and makers of search analytics, given that such services have existed since 2004, with several new ones launched since.
Social media optimization is in many ways connected to the technique of viral marketing or "viral seeding" where word of mouth is created through the use of networking in social bookmarking, video and photo sharing websites. An effective SMO campaign can harness the power of viral marketing; for example, 80% of activity on Pinterest is generated through "repinning."[citation needed] Furthermore, by following social trends and utilizing alternative social networks, websites can retain existing followers while also attracting new ones. This allows businesses to build an online following and presence, all linking back to the company's website for increased traffic. For example, with an effective social bookmarking campaign, not only can website traffic be increased, but a site's rankings can also be increased. In a similar way, the engagement with blogs creates a similar result by sharing content through the use of RSS in the blogosphere. Social media optimization is considered an integral part of an online reputation management (ORM) or search engine reputation management (SERM) strategy for organizations or individuals who care about their online presence.[8] SMO is one of six key influencers that affect Social Commerce Construct (SCC). Online activities such as consumers' evaluations and advices on products and services constitute part of what creates a Social Commerce Construct (SCC).[citation needed]
Social media optimization is not limited to marketing and brand building. Increasingly, smart businesses are integrating social media participation as part of their knowledge management strategy (i.e., product/service development, recruiting, employee engagement and turnover, brand building, customer satisfaction and relations, business development and more). Additionally, social media optimization can be implemented to foster a community of the associated site, allowing for a healthy business-to-consumer (B2C) relationship.[9]
According to technologist Danny Sullivan, the term "social media optimization" was first used and described by marketer Rohit Bhargava[10][11] on his marketing blog in August 2006. In the same post, Bhargava established the five important rules of social media optimization. Bhargava believed that by following his rules, anyone could influence the levels of traffic and engagement on their site, increase popularity, and ensure that it ranks highly in search engine results. An additional 11 SMO rules have since been added to the list by other marketing contributors.
The 16 rules of SMO, according to one source, are as follows:[12]
Bhargava's initial five rules were more specifically designed to SMO, while the list is now much broader and addresses everything that can be done across different social media platforms. According to author and CEO of TopRank Online Marketing, Lee Odden, a Social Media Strategy is also necessary to ensure optimization. This is a similar concept to Bhargava's list of rules for SMO.
The Social Media Strategy may consider:[13]
According to Lon Safko and David K. Brake in The Social Media Bible, it is also important to act like a publisher by maintaining an effective organizational strategy, to have an original concept and unique "edge" that differentiates one's approach from competitors, and to experiment with new ideas if things do not work the first time.[4] If a business is blog-based, an effective method of SMO is using widgets that allow users to share content to their personal social media platforms. This will ultimately reach a wider target audience and drive more traffic to the original post. Blog widgets and plug-ins for post-sharing are most commonly linked to Facebook, LinkedIn and x.com. They occasionally also link to social media platforms such as Tumblr and Pinterest. Many sharing widgets also include user counters which indicate how many times the content has been liked and shared across different social media pages. This can influence whether or not new users will engage with the post, and also gives businesses an idea of what kind of posts are most successful at engaging audiences. By using relevant and trending keywords in titles and throughout blog posts, a business can also increase search engine optimization and the chances of their content of being read and shared by a large audience.[13] The root of effective SMO is the content that is being posted, so professional content creation tools can be very beneficial. These can include editing programs such as Photoshop, GIMP, Final Cut Pro, and Dreamweaver. Many websites also offer customization options such as different layouts to personalize a page and create a point of difference.[4]
With social media sites overtaking TV as a source for news for young people, news organizations have become increasingly reliant on social media platforms for generating traffic. A report by Reuters Institute for the Study of Journalism described how a 'second wave of disruption' had hit news organizations,[14] with publishers such as The Economist having to employ large social media teams to optimize their posts, and maximize traffic.[5] Within the context of the publishing industry, even professional fields are utilizing SMO. Because doctors want to maximize exposure to their research findings SMO has also found a place in the medical field.[15]
Today, 3.8 billion people globally are using some form of social media.[citation needed] People frequently obtain health-related information from online social media platforms like Twitter and Facebook. Healthcare professionals and scientists can communicate with other medical-counterparts to discuss research and findings through social media platforms. These platforms provide researchers with data sets and surveillance that help detect patterns and behavior in preventing, informing, and studying global disease; COVID-19. Additionally, researchers utilize SMO to reach and recruit hard-to-reach patients. SMO narrows specified demographics that filter necessary data in a given study.[citation needed]
Social media gaming is online gaming activity performed through social media sites with friends and online gaming activity that promotes social media interaction. Examples of the former include FarmVille, Clash of Clans, Clash Royale, FrontierVille, and Mafia Wars. In these games a player's social network is exploited to recruit additional players and allies. An example of the latter is Empire Avenue, a virtual stock exchange where players buy and sell shares of each other's social network worth. Nielsen Media Research estimates that, as of June 2010, social networking and playing online games account for about one-third of all online activity by Americans.[16]
Facebook has in recent years become a popular channel for advertising, alongside traditional forms such as television, radio, and print. With over 1 billion active users, and 50% of those users logging into their accounts every day[17] it is an important communication platform that businesses can utilize and optimize to promote their brand and drive traffic to their websites. There are three commonly used strategies to increase advertising reach on Facebook:
Improving effectiveness and increasing network size are organic approaches, while buying more reach is a paid approach which does not require any further action.[18] Most businesses will attempt an "organic" approach to gaining a significant following before considering a paid approach. Because Facebook requires a login, it is important that posts are public to ensure they will reach the widest possible audience. Posts that have been heavily shared and interacted with by users are displayed as 'highlighted posts' at the top of newsfeeds. In order to achieve this status, the posts need to be engaging, interesting, or useful. This can be achieved by being spontaneous, asking questions, addressing current events and issues, and optimizing trending hashtags and keywords. The more engagement a post receives, the further it will spread and the more likely it is to feature on first in search results.
Another organic approach to Facebook optimization is cross-linking different social platforms. By posting links to websites or social media sites in the profile 'about' section, it is possible to direct traffic and ultimately increase search engine optimization. Another option is to share links to relevant videos and blog posts.[13] Facebook Connect is a functionality that launched in 2008 to allow Facebook users to sign up to different websites, enter competitions, and access exclusive promotions by logging in with their existing Facebook account details. This is beneficial to users as they don't have to create a new login every time they want to sign up to a website, but also beneficial to businesses as Facebook users become more likely to share their content. Often the two are interlinked, where in order to access parts of a website, a user has to like or share certain things on their personal profile or invite a number of friends to like a page. This can lead to greater traffic flow to a website as it reaches a wider audience. Businesses have more opportunities to reach their target markets if they choose a paid approach to SMO. When Facebook users create an account, they are urged to fill out their personal details such as gender, age, location, education, current and previous employers, religious and political views, interests, and personal preferences such as movie and music tastes. Facebook then takes this information and allows advertisers to use it to determine how to best market themselves to users that they know will be interested in their product. This can also be known as micro-targeting. If a user clicks on a link to like a page, it will show up on their profile and newsfeed. This then feeds back into organic social media optimization, as friends of the user will see this and be encouraged to click on the page themselves. Although advertisers are buying mass reach, they are attracting a customer base with a genuine interest in their product. Once a customer base has been established through a paid approach, businesses will often run promotions and competitions to attract more organic followers.[12]
The number of businesses that use Facebook to advertise also holds significant relevance. in 2017, there were three million businesses that advertised on Facebook.[19] This makes Facebook the world's largest platform for social media advertising. What also holds importance is the amount of money leading businesses are spending on Facebook advertising alone. Procter & Gamble spend $60 million every year on Facebook advertising.[20] Other advertisers on Facebook include Microsoft, with a yearly spend of £35 million, Amazon, Nestle and American Express all with yearly expenditures above £25 million per year.
Furthermore, the number of small businesses advertising on Facebook is of relevance. This number has grown rapidly over the upcoming years and demonstrates how important social media advertising actually is. Currently 70% of the UK's small businesses use Facebook advertising.[21] This is a substantial number of advertisers. Almost half of the world's small businesses use social media marketing product of some sort. This demonstrates the impact that social media has had on the current digital marketing era.
ER (Engagement Rate) represents the activity of users specific for a certain profile on Facebook, Instagram, Tiktok or any other Social Media. A common way to calculate it is the following:
In the above formula followers is the total number of followers (friends, subscribers, etc), interactions stands for the number of interactions, such as likes, comments, personal messages, shares. The latter is averaged over the certain period of time, which should normally be short enough to ensure the variance in followers number is negligible during this period.
Social media optimization.
Part of a series on |
Internet marketing |
---|
Search engine marketing |
Display advertising |
Affiliate marketing |
Mobile advertising |
Search engine optimization (SEO) is the process of improving the quality and quantity of website traffic to a website or a web page from search engines.[1][2] SEO targets unpaid traffic (known as "natural" or "organic" results) rather than direct traffic or paid traffic. Unpaid traffic may originate from different kinds of searches, including image search, video search, academic search,[3] news search, and industry-specific vertical search engines.
As an Internet marketing strategy, SEO considers how search engines work, the computer-programmed algorithms that dictate search engine behavior, what people search for, the actual search terms or keywords typed into search engines, and which search engines are preferred by their targeted audience. SEO is performed because a website will receive more visitors from a search engine when websites rank higher on the search engine results page (SERP). These visitors can then potentially be converted into customers.[4]
Webmasters and content providers began optimizing websites for search engines in the mid-1990s, as the first search engines were cataloging the early Web. Initially, all webmasters only needed to submit the address of a page, or URL, to the various engines, which would send a web crawler to crawl that page, extract links to other pages from it, and return information found on the page to be indexed.[5] The process involves a search engine spider/crawler crawls a page and storing it on the search engine's own server. A second program, known as an indexer, extracts information about the page, such as the words it contains, where they are located, and any weight for specific words, as well as all links the page contains. All of this information is then placed into a scheduler for crawling at a later date.
Website owners recognized the value of a high ranking and visibility in search engine results,[6] creating an opportunity for both white hat and black hat SEO practitioners. According to industry analyst Danny Sullivan, the phrase "search engine optimization" probably came into use in 1997. Sullivan credits Bruce Clay as one of the first people to popularize the term.[7]
Early versions of search algorithms relied on webmaster-provided information such as the keyword meta tag or index files in engines like ALIWEB. Meta tags provide a guide to each page's content. Using metadata to index pages was found to be less than reliable, however, because the webmaster's choice of keywords in the meta tag could potentially be an inaccurate representation of the site's actual content. Flawed data in meta tags, such as those that were inaccurate or incomplete, created the potential for pages to be mischaracterized in irrelevant searches.[8][dubious – discuss] Web content providers also manipulated some attributes within the HTML source of a page in an attempt to rank well in search engines.[9] By 1997, search engine designers recognized that webmasters were making efforts to rank well in their search engine and that some webmasters were even manipulating their rankings in search results by stuffing pages with excessive or irrelevant keywords. Early search engines, such as Altavista and Infoseek, adjusted their algorithms to prevent webmasters from manipulating rankings.[10]
By heavily relying on factors such as keyword density, which were exclusively within a webmaster's control, early search engines suffered from abuse and ranking manipulation. To provide better results to their users, search engines had to adapt to ensure their results pages showed the most relevant search results, rather than unrelated pages stuffed with numerous keywords by unscrupulous webmasters. This meant moving away from heavy reliance on term density to a more holistic process for scoring semantic signals.[11] Since the success and popularity of a search engine are determined by its ability to produce the most relevant results to any given search, poor quality or irrelevant search results could lead users to find other search sources. Search engines responded by developing more complex ranking algorithms, taking into account additional factors that were more difficult for webmasters to manipulate.
Companies that employ overly aggressive techniques can get their client websites banned from the search results. In 2005, the Wall Street Journal reported on a company, Traffic Power, which allegedly used high-risk techniques and failed to disclose those risks to its clients.[12] Wired magazine reported that the same company sued blogger and SEO Aaron Wall for writing about the ban.[13] Google's Matt Cutts later confirmed that Google did in fact ban Traffic Power and some of its clients.[14]
Some search engines have also reached out to the SEO industry and are frequent sponsors and guests at SEO conferences, webchats, and seminars. Major search engines provide information and guidelines to help with website optimization.[15][16] Google has a Sitemaps program to help webmasters learn if Google is having any problems indexing their website and also provides data on Google traffic to the website.[17] Bing Webmaster Tools provides a way for webmasters to submit a sitemap and web feeds, allows users to determine the "crawl rate", and track the web pages index status.
In 2015, it was reported that Google was developing and promoting mobile search as a key feature within future products. In response, many brands began to take a different approach to their Internet marketing strategies.[18]
In 1998, two graduate students at Stanford University, Larry Page and Sergey Brin, developed "Backrub", a search engine that relied on a mathematical algorithm to rate the prominence of web pages. The number calculated by the algorithm, PageRank, is a function of the quantity and strength of inbound links.[19] PageRank estimates the likelihood that a given page will be reached by a web user who randomly surfs the web and follows links from one page to another. In effect, this means that some links are stronger than others, as a higher PageRank page is more likely to be reached by the random web surfer.
Page and Brin founded Google in 1998.[20] Google attracted a loyal following among the growing number of Internet users, who liked its simple design.[21] Off-page factors (such as PageRank and hyperlink analysis) were considered as well as on-page factors (such as keyword frequency, meta tags, headings, links and site structure) to enable Google to avoid the kind of manipulation seen in search engines that only considered on-page factors for their rankings. Although PageRank was more difficult to game, webmasters had already developed link-building tools and schemes to influence the Inktomi search engine, and these methods proved similarly applicable to gaming PageRank. Many sites focus on exchanging, buying, and selling links, often on a massive scale. Some of these schemes, or link farms, involved the creation of thousands of sites for the sole purpose of link spamming.[22]
By 2004, search engines had incorporated a wide range of undisclosed factors in their ranking algorithms to reduce the impact of link manipulation.[23] The leading search engines, Google, Bing, and Yahoo, do not disclose the algorithms they use to rank pages. Some SEO practitioners have studied different approaches to search engine optimization and have shared their personal opinions.[24] Patents related to search engines can provide information to better understand search engines.[25] In 2005, Google began personalizing search results for each user: depending on their history of previous searches, Google crafted results for logged-in users.[26]
In 2007, Google announced a campaign against paid links that transfer PageRank.[27] On June 15, 2009, Google disclosed that they had taken measures to mitigate the effects of PageRank sculpting by use of the nofollow attribute on links. Matt Cutts, a well-known software engineer at Google, announced that Googlebot would no longer treat nofollowed links in the same way, in order to prevent SEO service providers from using nofollow for PageRank sculpting.[28] As a result of this change, the use of nofollow led to evaporation of PageRank. To avoid this, SEO engineers developed alternative techniques that replace nofollowed tags with obfuscated JavaScript and thus permit PageRank sculpting. Additionally, several solutions have been suggested that include the use of iframes, Flash, and JavaScript.[29]
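The nofollow mechanism itself is a single link attribute. A minimal illustrative example (the URL is a placeholder):

```html
<!-- A nofollowed link: a hint to search engines not to pass
     PageRank through this link. URL is illustrative. -->
<a href="https://example.com/untrusted-source" rel="nofollow">source</a>
```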
In December 2009, Google announced it would be using the web search history of all its users in order to populate search results.[30] On June 8, 2010, a new web indexing system called Google Caffeine was announced. Designed to allow users to find news results, forum posts, and other content much sooner after publishing than before, Google Caffeine was a change to the way Google updated its index in order to make things show up more quickly on Google than before. According to Carrie Grimes, the software engineer who announced Caffeine for Google, "Caffeine provides 50 percent fresher results for web searches than our last index..."[31] Google Instant, real-time search, was introduced in late 2010 in an attempt to make search results more timely and relevant. Historically, site administrators had spent months or even years optimizing a website to increase search rankings. With the growth in popularity of social media sites and blogs, the leading engines made changes to their algorithms to allow fresh content to rank quickly within the search results.[32]
In February 2011, Google announced the Panda update, which penalizes websites containing content duplicated from other websites and sources. Historically, websites had copied content from one another and benefited in search engine rankings by engaging in this practice. However, Google implemented a new system that punishes sites whose content is not unique.[33] The 2012 Google Penguin update attempted to penalize websites that used manipulative techniques to improve their rankings on the search engine.[34] Although Google Penguin has been presented as an algorithm aimed at fighting web spam, it really focuses on spammy links[35] by gauging the quality of the sites the links come from. The 2013 Google Hummingbird update featured an algorithm change designed to improve Google's natural language processing and semantic understanding of web pages. Hummingbird's language processing system falls under the newly recognized term of "conversational search", where the system pays more attention to each word in the query in order to match pages to the meaning of the whole query rather than a few words.[36] With regard to search engine optimization, for content publishers and writers, Hummingbird is intended to resolve issues by getting rid of irrelevant content and spam, allowing Google to surface high-quality content from "trusted" authors.
In October 2019, Google announced they would start applying BERT models for English-language search queries in the US. Bidirectional Encoder Representations from Transformers (BERT) was another attempt by Google to improve its natural language processing, this time in order to better understand the search queries of its users.[37] In terms of search engine optimization, BERT was intended to connect users more easily to relevant content and increase the quality of traffic coming to websites that rank in the search engine results page.
The leading search engines, such as Google, Bing, and Yahoo!, use crawlers to find pages for their algorithmic search results. Pages that are linked from other search engine-indexed pages do not need to be submitted because they are found automatically. The Yahoo! Directory and DMOZ, two major directories which closed in 2014 and 2017 respectively, both required manual submission and human editorial review.[38] Google offers Google Search Console, for which an XML sitemap feed can be created and submitted for free to ensure that all pages are found, especially pages that are not discoverable by automatically following links,[39] in addition to its URL submission console.[40] Yahoo! formerly operated a paid submission service that guaranteed crawling for a cost per click;[41] however, this practice was discontinued in 2009.
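An XML sitemap is a small file listing URLs for crawlers to discover. A minimal illustrative example following the sitemaps.org schema (URLs and dates are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2024-01-15</lastmod>
  </url>
  <url>
    <loc>https://example.com/products</loc>
  </url>
</urlset>
```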
Search engine crawlers may look at a number of different factors when crawling a site. Not every page is indexed by search engines. The distance of pages from the root directory of a site may also be a factor in whether or not pages get crawled.[42]
Mobile devices are used for the majority of Google searches.[43] In November 2016, Google announced a major change to the way it crawls websites and started to make its index mobile-first, which means the mobile version of a given website becomes the starting point for what Google includes in its index.[44] In May 2019, Google updated the rendering engine of its crawler to be the latest version of Chromium (74 at the time of the announcement) and indicated that it would regularly update the Chromium rendering engine to the latest version.[45] In December 2019, Google began updating the User-Agent string of its crawler to reflect the latest Chrome version used by its rendering service. The delay was to allow webmasters time to update code that responded to particular bot User-Agent strings. Google ran evaluations and felt confident the impact would be minor.[46]
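The practical lesson for webmasters is to match the stable "Googlebot" token rather than a full, version-pinned User-Agent string. A minimal sketch, with an assumed example string:

```python
# Sketch: detect Googlebot by its stable token instead of pinning a
# full User-Agent string, which changes as Google updates Chromium.
def is_googlebot(user_agent: str) -> bool:
    return "Googlebot" in user_agent

# Assumed example string; real strings vary with the Chrome version.
ua = ("Mozilla/5.0 AppleWebKit/537.36 (KHTML, like Gecko; compatible; "
      "Googlebot/2.1; +http://www.google.com/bot.html) Chrome/120.0.0.0 "
      "Safari/537.36")
print(is_googlebot(ua))  # True regardless of the embedded Chrome version
```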
To avoid undesirable content in the search indexes, webmasters can instruct spiders not to crawl certain files or directories through the standard robots.txt file in the root directory of the domain. Additionally, a page can be explicitly excluded from a search engine's database by using a meta tag specific to robots (usually <meta name="robots" content="noindex">). When a search engine visits a site, the robots.txt located in the root directory is the first file crawled. The robots.txt file is then parsed, and it instructs the robot as to which pages are not to be crawled. As a search engine crawler may keep a cached copy of this file, it may on occasion crawl pages a webmaster does not wish crawled. Pages typically prevented from being crawled include login-specific pages such as shopping carts and user-specific content such as search results from internal searches. In March 2007, Google warned webmasters that they should prevent indexing of internal search results because those pages are considered search spam.[47] In 2020, Google sunsetted the standard (and open-sourced its code) and now treats robots.txt as a hint, not a directive. To adequately ensure that pages are not indexed, a page-level robots meta tag should be included.[48]
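A robots.txt blocking the page types mentioned above might look like the following (the paths are illustrative). Because Google treats the file as a hint, the page-level meta tag quoted earlier is still needed to keep a page out of the index:

```
# https://example.com/robots.txt -- paths are illustrative
User-agent: *
Disallow: /cart/
Disallow: /search/
```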
A variety of methods can increase the prominence of a webpage within the search results. Cross linking between pages of the same website to provide more links to important pages may improve its visibility. Page design makes users trust a site and want to stay once they find it; when people bounce off a site, it counts against the site and affects its credibility.[49] Writing content that includes frequently searched keyword phrases, so as to be relevant to a wide variety of search queries, will tend to increase traffic. Updating content so as to keep search engines crawling back frequently can give additional weight to a site. Adding relevant keywords to a web page's metadata, including the title tag and meta description, will tend to improve the relevancy of a site's search listings, thus increasing traffic. URL canonicalization of web pages accessible via multiple URLs, using the canonical link element[50] or via 301 redirects, can help make sure links to different versions of the URL all count towards the page's link popularity score. These incoming links, which point to the preferred URL, count towards the page's link popularity score and affect the credibility of the website.[49]
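The canonical link element is a single tag placed in the <head> of each duplicate URL, pointing at the preferred version; a 301 redirect consolidates link signals in the same way at the HTTP level. A minimal example (the URL is illustrative):

```html
<!-- Placed in the <head> of every variant URL so link signals
     consolidate on the preferred page; href is illustrative. -->
<link rel="canonical" href="https://example.com/product">
```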
SEO techniques can be classified into two broad categories: techniques that search engine companies recommend as part of good design ("white hat"), and those techniques of which search engines do not approve ("black hat"). Search engines attempt to minimize the effect of the latter, among them spamdexing. Industry commentators have classified these methods and the practitioners who employ them as either white hat SEO or black hat SEO.[51] White hats tend to produce results that last a long time, whereas black hats anticipate that their sites may eventually be banned either temporarily or permanently once the search engines discover what they are doing.[52]
An SEO technique is considered a white hat if it conforms to the search engines' guidelines and involves no deception. As the search engine guidelines[15][16][53] are not written as a series of rules or commandments, this is an important distinction to note. White hat SEO is not just about following guidelines but is about ensuring that the content a search engine indexes and subsequently ranks is the same content a user will see. White hat advice is generally summed up as creating content for users, not for search engines, and then making that content easily accessible to the online "spider" algorithms, rather than attempting to trick the algorithm from its intended purpose. White hat SEO is in many ways similar to web development that promotes accessibility,[54] although the two are not identical.
Black hat SEO attempts to improve rankings in ways that are disapproved of by the search engines or involve deception. One black hat technique uses hidden text, either as text colored similarly to the background, in an invisible div, or positioned off-screen. Another method gives a different page depending on whether the page is being requested by a human visitor or a search engine, a technique known as cloaking. Another category sometimes used is grey hat SEO. This is in between the black hat and white hat approaches, where the methods employed avoid the site being penalized but stop short of producing the best content for users. Grey hat SEO is entirely focused on improving search engine rankings.
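As a concrete illustration of the hidden-text technique described above (shown only to make the technique recognizable, not as a recommendation; it violates search engine guidelines and risks penalties), keyword-stuffed text can be positioned off-screen so that only crawlers read it:

```html
<!-- Black-hat hidden text: present in the markup for crawlers,
     pushed off-screen so users never see it. Illustration only. -->
<div style="position:absolute; left:-9999px;">
  cheap widgets best widgets buy widgets
</div>
```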
Search engines may penalize sites they discover using black or grey hat methods, either by reducing their rankings or eliminating their listings from their databases altogether. Such penalties can be applied either automatically by the search engines' algorithms or by a manual site review. One example was the February 2006 Google removal of both BMW Germany and Ricoh Germany for the use of deceptive practices.[55] Both companies, however, quickly apologized, fixed the offending pages, and were restored to Google's search engine results page.[56]
SEO is not an appropriate strategy for every website, and other Internet marketing strategies can be more effective, such as paid advertising through pay-per-click (PPC) campaigns, depending on the site operator's goals. Search engine marketing (SEM) is the practice of designing, running, and optimizing search engine ad campaigns. Its difference from SEO is most simply depicted as the difference between paid and unpaid priority ranking in search results. SEM focuses on prominence more so than relevance; website developers should treat SEM as critically important to visibility, because most searchers navigate to the primary listings of their search results.[57] A successful Internet marketing campaign may also depend upon building high-quality web pages to engage and persuade internet users, setting up analytics programs to enable site owners to measure results, and improving a site's conversion rate.[58][59] In November 2015, Google released a full 160-page version of its Search Quality Rating Guidelines to the public,[60] which revealed a shift in its focus towards "usefulness" and mobile local search. In recent years the mobile market has exploded, overtaking the use of desktops; StatCounter demonstrated this in October 2016, when it analyzed 2.5 million websites and found that 51.3% of pages were loaded by a mobile device.[61] Google has been one of the companies capitalizing on the popularity of mobile usage by encouraging websites to use its Google Search Console and the Mobile-Friendly Test, which allow companies to check their websites against the search engine's criteria and determine how user-friendly they are. The closer together relevant keywords appear on a page, the more its ranking tends to improve for those key terms.[49]
SEO may generate an adequate return on investment. However, search engines are not paid for organic search traffic, their algorithms change, and there are no guarantees of continued referrals. Due to this lack of guarantee and uncertainty, a business that relies heavily on search engine traffic can suffer major losses if the search engines stop sending visitors.[62] Search engines can change their algorithms, impacting a website's search engine ranking, possibly resulting in a serious loss of traffic. According to Google's CEO, Eric Schmidt, in 2010, Google made over 500 algorithm changes – almost 1.5 per day.[63] It is considered a wise business practice for website operators to liberate themselves from dependence on search engine traffic.[64] In addition to accessibility in terms of web crawlers (addressed above), user web accessibility has become increasingly important for SEO.
Optimization techniques are highly tuned to the dominant search engines in the target market. The search engines' market shares vary from market to market, as does competition. In 2003, Danny Sullivan stated that Google represented about 75% of all searches.[65] In markets outside the United States, Google's share is often larger, and Google remains the dominant search engine worldwide as of 2007.[66] As of 2006, Google had an 85–90% market share in Germany.[67] While there were hundreds of SEO firms in the US at that time, there were only about five in Germany.[67] As of June 2008, the market share of Google in the UK was close to 90% according to Hitwise.[68] Google achieves a similar market share in a number of other countries.
As of 2009, there are only a few large markets where Google is not the leading search engine. In most cases, when Google is not leading in a given market, it is lagging behind a local player. The most notable example markets are China, Japan, South Korea, Russia, and the Czech Republic, where respectively Baidu, Yahoo! Japan, Naver, Yandex and Seznam are market leaders.
Successful search optimization for international markets may require professional translation of web pages, registration of a domain name with a top level domain in the target market, and web hosting that provides a local IP address. Otherwise, the fundamental elements of search optimization are essentially the same, regardless of language.[67]
On October 17, 2002, SearchKing filed suit in the United States District Court, Western District of Oklahoma, against the search engine Google. SearchKing's claim was that Google's tactics to prevent spamdexing constituted a tortious interference with contractual relations. On May 27, 2003, the court granted Google's motion to dismiss the complaint because SearchKing "failed to state a claim upon which relief may be granted."[69][70]
In March 2006, KinderStart filed a lawsuit against Google over search engine rankings. KinderStart's website was removed from Google's index prior to the lawsuit, and the amount of traffic to the site dropped by 70%. On March 16, 2007, the United States District Court for the Northern District of California (San Jose Division) dismissed KinderStart's complaint without leave to amend and partially granted Google's motion for Rule 11 sanctions against KinderStart's attorney, requiring him to pay part of Google's legal expenses.[71][72]
Search analytics is the use of search data to investigate particular interactions among Web searchers, the search engine, or the content during searching episodes.[1] The resulting analysis and aggregation of search engine statistics can be used in search engine marketing (SEM) and search engine optimization (SEO). In other words, search analytics helps website owners understand and improve their performance on search engines, for example by identifying highly valuable site visitors[2] or understanding user intent.[3] Search analytics includes search volume trends and analysis, reverse searching (entering websites to see their keywords), keyword monitoring, search result and advertisement history, advertisement spending statistics, website comparisons, affiliate marketing statistics, multivariate ad testing, and more.[4]
Search analytics data can be collected in several ways. Search engines provide access to their own data with services such as Google Analytics,[5] Google Trends, and Google Insights. Third-party services must collect their data from ISPs, phone-home software, or by scraping search engines. Getting traffic statistics from ISPs and phone-home software provides for broader reporting of web traffic in addition to search analytics. Services that perform keyword monitoring only scrape a limited set of search results, depending on their clients' needs. Services providing reverse search, however, must scrape a large set of keywords from the search engines, usually in the millions, to find the keywords that everyone is using.[6]
Since search results, and especially advertisements, differ depending on the searcher's location, data collection methods have to account for geographic location. Keyword monitors do this more easily, since they typically know what location their client is targeting. To get an exhaustive reverse search, however, several locations need to be scraped for the same keyword.
The accuracy of search analytics depends on the service being used, the data collection method, and data freshness. Google releases its own data, but only in an aggregated way and often without assigning absolute values, such as the number of visitors, to its graphs.[7] ISP logs and phone-home methods are accurate for the population they sample, so sample size and demographics must be adequate to represent the larger population accurately. Scraping results can be highly accurate, especially when looking at the non-paid, organic search results. Paid results, from Google AdWords for example,[8] are often different for the same search depending on the time, geographic location, and history of searches from a particular computer. This means that scraping advertisements can be hit or miss.
A look at Google Insights to gauge the popularity of these services shows that, compared to searches for the term AdWords (Google's popular search ad system), use of search analytics services was still very low, around 1-25% as of October 2009.[9] This could point to a large opportunity for the users and makers of search analytics, given that such services have existed since 2004, with several new services started since then.
Social media optimization (SMO) is the use of online platforms to generate income or publicity to increase the awareness of a brand, event, product, or service. Types of social media involved include RSS feeds, blogging sites, social bookmarking sites, social news websites, video sharing websites such as YouTube, and social networking sites such as Facebook, Instagram, TikTok, and X (Twitter). SMO is similar to search engine optimization (SEO) in that the goal is to drive web traffic and draw attention to a company or creator. SMO's focal point is on gaining organic links to social media content. In contrast, SEO's core is about reaching the top of the search engine hierarchy.[1] In general, social media optimization refers to optimizing a website and its content to encourage more users to use and share links to the website across social media and networking sites.[2]
SMO is used to strategically create online content ranging from well-written text to eye-catching digital photos or video clips that encourages and entices people to engage with a website. Users share this content, via its weblink, with social media contacts and friends. Common examples of social media engagement are "liking and commenting on posts, retweeting, embedding, sharing, and promoting content".[3] Social media optimization is also an effective way of implementing online reputation management (ORM), meaning that if someone posts bad reviews of a business, an SMO strategy can ensure that the negative feedback is not the first link to come up in a list of search engine results.[4]
In the 2010s, with social media sites overtaking TV as a source for news for young people, news organizations have become increasingly reliant on social media platforms for generating web traffic. Publishers such as The Economist employ large social media teams to optimize their online posts and maximize traffic,[5] while other major publishers now use advanced artificial intelligence (AI) technology to generate higher volumes of web traffic.[6]
Social media optimization is an increasingly important factor in search engine optimization, which is the process of designing a website so that it has as high a ranking as possible on search engines. Search engines are increasingly utilizing the recommendations of users of social networks such as Reddit, Facebook, Tumblr, Twitter, YouTube, LinkedIn, Pinterest, and Instagram to rank pages in the search engine result pages.[7] The implication is that when a webpage is shared or "liked" by a user on a social network, it counts as a "vote" for that webpage's quality, and search engines can use such votes to rank websites in search engine results pages accordingly. Furthermore, since it is more difficult to tip the scales or influence the search engines in this way, search engines are putting more stock into social search.[7] This, coupled with increasingly personalized search based on interests and location, has significantly increased the importance of a social media presence in search engine optimization. Due to personalized search results, location-based social media presences on websites such as Yelp, Google Places, Foursquare, and Yahoo! Local have become increasingly important. While social media optimization is related to search engine marketing, it differs in several ways. Primarily, SMO focuses on driving web traffic from sources other than search engines, though improved search engine ranking is also a benefit of successful social media optimization. Further, SMO is helpful for targeting particular geographic regions in order to reach potential customers there. This helps in lead generation (finding new customers) and contributes to high conversion rates (i.e., converting previously uninterested individuals into people who are interested in a brand or organization).
Social media optimization is in many ways connected to the technique of viral marketing or "viral seeding", where word of mouth is created through the use of networking in social bookmarking, video, and photo sharing websites. An effective SMO campaign can harness the power of viral marketing; for example, 80% of activity on Pinterest is generated through "repinning". Furthermore, by following social trends and utilizing alternative social networks, websites can retain existing followers while also attracting new ones. This allows businesses to build an online following and presence, all linking back to the company's website for increased traffic. For example, with an effective social bookmarking campaign, not only can website traffic be increased, but a site's rankings can also be increased. In a similar way, engagement with blogs creates a similar result by sharing content through the use of RSS in the blogosphere. Social media optimization is considered an integral part of an online reputation management (ORM) or search engine reputation management (SERM) strategy for organizations or individuals who care about their online presence.[8] SMO is one of six key influencers that affect the social commerce construct (SCC); online activities such as consumers' evaluations of and advice on products and services constitute part of what creates this construct.
Social media optimization is not limited to marketing and brand building. Increasingly, smart businesses are integrating social media participation as part of their knowledge management strategy (i.e., product/service development, recruiting, employee engagement and turnover, brand building, customer satisfaction and relations, business development and more). Additionally, social media optimization can be implemented to foster a community of the associated site, allowing for a healthy business-to-consumer (B2C) relationship.[9]
According to technologist Danny Sullivan, the term "social media optimization" was first used and described by marketer Rohit Bhargava[10][11] on his marketing blog in August 2006. In the same post, Bhargava established the five important rules of social media optimization. Bhargava believed that by following his rules, anyone could influence the levels of traffic and engagement on their site, increase popularity, and ensure that it ranks highly in search engine results. An additional 11 SMO rules have since been added to the list by other marketing contributors.
One source enumerates 16 rules of SMO.[12] Bhargava's initial five rules were more specifically designed for SMO, while the expanded list addresses everything that can be done across different social media platforms. According to Lee Odden, author and CEO of TopRank Online Marketing, a social media strategy is also necessary to ensure optimization; this is a similar concept to Bhargava's list of rules for SMO.
Such a social media strategy may take a number of factors into consideration alongside these rules.[13]
According to Lon Safko and David K. Brake in The Social Media Bible, it is also important to act like a publisher by maintaining an effective organizational strategy, to have an original concept and unique "edge" that differentiates one's approach from competitors, and to experiment with new ideas if things do not work the first time.[4]

If a business is blog-based, an effective method of SMO is using widgets that allow users to share content to their personal social media platforms. This ultimately reaches a wider target audience and drives more traffic to the original post. Blog widgets and plug-ins for post-sharing are most commonly linked to Facebook, LinkedIn, and x.com; they occasionally also link to social media platforms such as Tumblr and Pinterest. Many sharing widgets also include user counters which indicate how many times the content has been liked and shared across different social media pages. This can influence whether or not new users will engage with the post, and it also gives businesses an idea of what kinds of posts are most successful at engaging audiences.

By using relevant and trending keywords in titles and throughout blog posts, a business can also improve search engine optimization and the chances of its content being read and shared by a large audience.[13] The root of effective SMO is the content being posted, so professional content creation tools can be very beneficial. These can include editing programs such as Photoshop, GIMP, Final Cut Pro, and Dreamweaver. Many websites also offer customization options such as different layouts to personalize a page and create a point of difference.[4]
A report by the Reuters Institute for the Study of Journalism described how a "second wave of disruption" had hit news organizations,[14] with publishers such as The Economist having to employ large social media teams to optimize their posts and maximize traffic.[5] Within the publishing industry, even professional fields are utilizing SMO: because doctors want to maximize exposure to their research findings, SMO has found a place in the medical field.[15]
Today, 3.8 billion people globally are using some form of social media. People frequently obtain health-related information from online social media platforms such as Twitter and Facebook, and healthcare professionals and scientists can communicate with medical counterparts to discuss research and findings through these platforms. Social media platforms also provide researchers with data sets and surveillance that help detect patterns and behavior in preventing, informing, and studying global disease, such as COVID-19. Additionally, researchers utilize SMO to reach and recruit hard-to-reach patients, since SMO can narrow the specified demographics that filter the necessary data in a given study.
Social media gaming is online gaming activity performed through social media sites with friends and online gaming activity that promotes social media interaction. Examples of the former include FarmVille, Clash of Clans, Clash Royale, FrontierVille, and Mafia Wars. In these games a player's social network is exploited to recruit additional players and allies. An example of the latter is Empire Avenue, a virtual stock exchange where players buy and sell shares of each other's social network worth. Nielsen Media Research estimates that, as of June 2010, social networking and playing online games account for about one-third of all online activity by Americans.[16]
Facebook has in recent years become a popular channel for advertising, alongside traditional forms such as television, radio, and print. With over 1 billion active users, 50% of whom log into their accounts every day,[17] it is an important communication platform that businesses can utilize and optimize to promote their brand and drive traffic to their websites. There are three commonly used strategies to increase advertising reach on Facebook: improving the effectiveness of posts, increasing network size, and buying more reach.
Improving effectiveness and increasing network size are organic approaches, while buying more reach is a paid approach which does not require any further action.[18] Most businesses will attempt an "organic" approach to gaining a significant following before considering a paid approach. Because Facebook requires a login, it is important that posts are public to ensure they reach the widest possible audience. Posts that have been heavily shared and interacted with by users are displayed as "highlighted posts" at the top of newsfeeds. In order to achieve this status, posts need to be engaging, interesting, or useful. This can be achieved by being spontaneous, asking questions, addressing current events and issues, and optimizing trending hashtags and keywords. The more engagement a post receives, the further it spreads and the more likely it is to appear first in search results.
Another organic approach to Facebook optimization is cross-linking different social platforms. By posting links to websites or social media sites in the profile's "about" section, it is possible to direct traffic and ultimately increase search engine optimization. Another option is to share links to relevant videos and blog posts.[13]

Facebook Connect is a functionality that launched in 2008 to allow Facebook users to sign up to different websites, enter competitions, and access exclusive promotions by logging in with their existing Facebook account details. This is beneficial to users, who do not have to create a new login every time they want to sign up to a website, but it is also beneficial to businesses, as Facebook users become more likely to share their content. Often the two are interlinked: in order to access parts of a website, a user has to like or share certain things on their personal profile or invite a number of friends to like a page. This can lead to greater traffic flow to a website as it reaches a wider audience.

Businesses have more opportunities to reach their target markets if they choose a paid approach to SMO. When Facebook users create an account, they are urged to fill out personal details such as gender, age, location, education, current and previous employers, religious and political views, interests, and personal preferences such as movie and music tastes. Facebook then takes this information and allows advertisers to use it to determine how best to market themselves to users they know will be interested in their product. This can also be known as micro-targeting. If a user clicks a link to like a page, it will show up on their profile and newsfeed. This then feeds back into organic social media optimization, as friends of the user will see this and be encouraged to click on the page themselves. Although advertisers are buying mass reach, they attract a customer base with a genuine interest in their product. Once a customer base has been established through a paid approach, businesses will often run promotions and competitions to attract more organic followers.[12]
The number of businesses that use Facebook to advertise also holds significant relevance. In 2017, three million businesses advertised on Facebook,[19] making it the world's largest platform for social media advertising. The amounts that leading businesses spend on Facebook advertising alone are also significant: Procter & Gamble spends $60 million every year,[20] while other advertisers on Facebook include Microsoft, with a yearly spend of £35 million, and Amazon, Nestlé, and American Express, each with yearly expenditures above £25 million.
Furthermore, the number of small businesses advertising on Facebook is relevant. This number has grown rapidly in recent years, demonstrating how important social media advertising has become: currently, 70% of the UK's small businesses use Facebook advertising.[21] Almost half of the world's small businesses use some form of social media marketing product, which demonstrates the impact that social media has had on the current digital marketing era.
The engagement rate (ER) measures the activity of users on a given profile on Facebook, Instagram, TikTok, or any other social media platform. A common way to calculate it is:

\[ \mathrm{ER} = \frac{\overline{\text{interactions}}}{\text{followers}} \times 100\% \]

In this formula, followers is the total number of followers (friends, subscribers, etc.) and interactions is the number of interactions, such as likes, comments, personal messages, and shares. The latter is averaged over a chosen period of time, which should normally be short enough that the variance in the follower count is negligible during that period.
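A minimal sketch of this calculation, assuming interactions are tallied per post over the chosen period (the sample numbers are illustrative):

```python
# Engagement rate as described above: interactions averaged over a
# period, divided by followers, expressed as a percentage.
def engagement_rate(interactions_per_post, followers):
    avg = sum(interactions_per_post) / len(interactions_per_post)
    return 100.0 * avg / followers

# Three posts with 120, 80, and 100 interactions; 5,000 followers.
print(engagement_rate([120, 80, 100], followers=5000))  # 2.0 (percent)
```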